# STDOUT: ---v---v---v---v---v---
ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/home/jenkins/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /opt/ansible-2.9/lib/python3.6/site-packages/ansible
  executable location = /opt/ansible-2.9/bin/ansible-playbook
  python version = 3.6.8 (default, Jan 25 2023, 15:03:30) [GCC 8.5.0 20210514 (Red Hat 8.5.0-18)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: tests_create_partition_volume_then_remove_scsi_generated.yml *********
2 plays in /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/tests_create_partition_volume_then_remove_scsi_generated.yml

PLAY [Run test tests_create_partition_volume_then_remove.yml for scsi] *********

TASK [Gathering Facts] *********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/tests_create_partition_volume_then_remove_scsi_generated.yml:3
Wednesday 31 May 2023  18:09:40 +0000 (0:00:00.016)       0:00:00.016 *********
ok: [sut]
META: ran handlers

TASK [Set disk interface for test] *********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/tests_create_partition_volume_then_remove_scsi_generated.yml:8
Wednesday 31 May 2023  18:09:41 +0000 (0:00:00.819)       0:00:00.835 *********
ok: [sut] => {
    "ansible_facts": {
        "storage_test_use_interface": "scsi"
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY [Test partition volume creation and remove] *******************************

TASK [Gathering Facts] *********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/tests_create_partition_volume_then_remove.yml:2
Wednesday 31 May 2023  18:09:41 +0000 (0:00:00.016)       0:00:00.852 *********
ok: [sut]
META: ran handlers

TASK [Run the role] ************************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/tests_create_partition_volume_then_remove.yml:10
Wednesday 31 May 2023  18:09:41 +0000 (0:00:00.514)       0:00:01.367 *********

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 31 May 2023  18:09:41 +0000 (0:00:00.024)       0:00:01.391 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 31 May 2023  18:09:41 +0000 (0:00:00.034)       0:00:01.426 *********
ok: [sut]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 31 May 2023  18:09:42 +0000 (0:00:00.342)       0:00:01.768 *********
skipping: [sut] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [sut] => (item=Fedora.yml) => {
    "ansible_facts": {
        "_storage_copr_packages": [
            {
                "packages": [
                    "vdo",
                    "kmod-vdo"
                ],
                "repository": "rhawalsh/dm-vdo"
            }
        ],
        "_storage_copr_support_packages": [
            "dnf-plugins-core"
        ],
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap"
        ]
    },
    "ansible_included_var_files": [
        "/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/vars/Fedora.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora.yml"
}
skipping: [sut] => (item=Fedora_36.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_36.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [sut] => (item=Fedora_36.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_36.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 31 May 2023  18:09:42 +0000 (0:00:00.048)       0:00:01.816 *********
ok: [sut] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 31 May 2023  18:09:42 +0000 (0:00:00.012)       0:00:01.829 *********
ok: [sut] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Include the appropriate provider tasks] *****
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 31 May 2023  18:09:42 +0000 (0:00:00.012)       0:00:01.842 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut

TASK [linux-system-roles.storage : Make sure blivet is available] **************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 31 May 2023  18:09:42 +0000 (0:00:00.042)       0:00:01.884 *********
changed: [sut] => {
    "changed": true,
    "rc": 0,
    "results": [
        "Installed: python3-blivet-1:3.4.4-1.fc36.noarch",
        "Installed: python3-blockdev-2.28-2.fc36.x86_64",
        "Installed: python3-bytesize-2.7-1.fc36.x86_64",
        "Installed: device-mapper-event-1.02.175-7.fc36.x86_64",
        "Installed: libblockdev-btrfs-2.28-2.fc36.x86_64",
        "Installed: lzo-2.10-6.fc36.x86_64",
        "Installed: device-mapper-event-libs-1.02.175-7.fc36.x86_64",
        "Installed: sgpio-1.2.0.10-30.fc36.x86_64",
        "Installed: device-mapper-persistent-data-0.9.0-7.fc36.x86_64",
        "Installed: python3-pyparted-1:3.12.0-1.fc36.x86_64",
        "Installed: libblockdev-dm-2.28-2.fc36.x86_64",
        "Installed: lvm2-2.03.11-7.fc36.x86_64",
        "Installed: cxl-libs-76.1-1.fc36.x86_64",
        "Installed: lvm2-libs-2.03.11-7.fc36.x86_64",
        "Installed: libblockdev-kbd-2.28-2.fc36.x86_64",
        "Installed: blivet-data-1:3.4.4-1.fc36.noarch",
        "Installed: libblockdev-lvm-2.28-2.fc36.x86_64",
        "Installed: libblockdev-mpath-2.28-2.fc36.x86_64",
        "Installed: libblockdev-nvdimm-2.28-2.fc36.x86_64",
        "Installed: ndctl-76.1-1.fc36.x86_64",
        "Installed: lsof-4.94.0-3.fc36.x86_64",
        "Installed: device-mapper-multipath-0.8.7-9.fc36.x86_64",
        "Installed: bcache-tools-1.1-2.fc36.x86_64",
        "Installed: ndctl-libs-76.1-1.fc36.x86_64",
        "Installed: device-mapper-multipath-libs-0.8.7-9.fc36.x86_64",
        "Installed: daxctl-libs-76.1-1.fc36.x86_64",
        "Installed: btrfs-progs-6.2.2-1.fc36.x86_64",
        "Installed: dmraid-1.0.0.rc16-52.fc36.x86_64",
        "Installed: dmraid-events-1.0.0.rc16-52.fc36.x86_64",
        "Installed: dmraid-libs-1.0.0.rc16-52.fc36.x86_64",
        "Installed: libaio-0.3.111-13.fc36.x86_64",
        "Installed: iniparser-4.1-9.fc36.x86_64"
    ]
}

TASK [linux-system-roles.storage : Show storage_pools] *************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:9
Wednesday 31 May 2023  18:09:52 +0000 (0:00:10.560)       0:00:12.445 *********
ok: [sut] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : Show storage_volumes] ***********************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 31 May 2023  18:09:52 +0000 (0:00:00.026)       0:00:12.473 *********
ok: [sut] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : Get required packages] **********************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 31 May 2023  18:09:52 +0000 (0:00:00.026)       0:00:12.499 *********
ok: [sut] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : Enable copr repositories if needed] *********
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:32
Wednesday 31 May 2023  18:09:53 +0000 (0:00:00.624)       0:00:13.124 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut

TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 31 May 2023  18:09:53 +0000 (0:00:00.048)       0:00:13.173 *********
skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => {
    "ansible_loop_var": "repo",
    "changed": false,
    "repo": {
        "packages": [
            "vdo",
            "kmod-vdo"
        ],
        "repository": "rhawalsh/dm-vdo"
    },
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Make sure COPR support packages are present] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 31 May 2023  18:09:53 +0000 (0:00:00.023)       0:00:13.196 *********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Enable COPRs] *******************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 31 May 2023  18:09:53 +0000 (0:00:00.015)       0:00:13.212 *********
skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => {
    "ansible_loop_var": "repo",
    "changed": false,
    "repo": {
        "packages": [
            "vdo",
            "kmod-vdo"
        ],
        "repository": "rhawalsh/dm-vdo"
    },
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Make sure required packages are installed] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:39
Wednesday 31 May 2023 18:09:53 +0000 (0:00:00.026) 0:00:13.238 ********* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:46 Wednesday 31 May 2023 18:09:55 +0000 (0:00:02.438) 0:00:15.677 ********* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": 
"cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": 
"inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": 
"mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "inactive", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" 
}, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": 
"systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": 
"user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:53 Wednesday 31 May 2023 18:09:58 +0000 (0:00:02.183) 0:00:17.860 ********* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:67 Wednesday 31 May 2023 18:09:58 +0000 (0:00:00.034) 0:00:17.894 ********* TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:73 Wednesday 31 May 2023 18:09:58 +0000 (0:00:00.017) 0:00:17.911 ********* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:87 Wednesday 31 May 2023 18:09:58 +0000 (0:00:00.357) 0:00:18.269 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 31 May 2023 18:09:58 +0000 (0:00:00.047) 0:00:18.316 ********* TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:105 Wednesday 31 May 2023 18:09:58 +0000 (0:00:00.017) 0:00:18.333 ********* ok: [sut] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Wednesday 31 May 2023 18:09:58 +0000 (0:00:00.023) 0:00:18.357 ********* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 31 May 2023 18:09:58 +0000 (0:00:00.022) 0:00:18.380 ********* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:130 Wednesday 31 May 2023 18:09:58 +0000 (0:00:00.029) 0:00:18.409 ********* TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:142 Wednesday 31 May 2023 18:09:58 +0000 (0:00:00.018) 0:00:18.428 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:147 Wednesday 31 May 2023 18:09:58 +0000 (0:00:00.020) 0:00:18.449 
********* TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:159 Wednesday 31 May 2023 18:09:58 +0000 (0:00:00.018) 0:00:18.467 ********* TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:174 Wednesday 31 May 2023 18:09:58 +0000 (0:00:00.017) 0:00:18.484 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:182 Wednesday 31 May 2023 18:09:58 +0000 (0:00:00.023) 0:00:18.508 ********* ok: [sut] => { "changed": false, "stat": { "atime": 1685556054.43803, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1684244424.757, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131081, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1684244183.529, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3816983141", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:187 Wednesday 31 May 2023 18:09:59 +0000 (0:00:00.264) 0:00:18.773 ********* TASK [linux-system-roles.storage : Update facts] ******************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:209 Wednesday 31 May 2023 18:09:59 +0000 (0:00:00.019) 0:00:18.792 ********* ok: [sut] TASK [Mark tasks to be skipped] ************************************************ task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/tests_create_partition_volume_then_remove.yml:14 Wednesday 31 May 2023 18:09:59 +0000 (0:00:00.613) 0:00:19.406 ********* ok: [sut] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [Get unused disks] ******************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/tests_create_partition_volume_then_remove.yml:21 Wednesday 31 May 2023 18:09:59 +0000 (0:00:00.017) 0:00:19.423 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/get_unused_disk.yml for sut TASK [Find unused disks in the system] ***************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/get_unused_disk.yml:2 Wednesday 31 May 2023 18:09:59 +0000 (0:00:00.027) 0:00:19.451 ********* ok: [sut] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/get_unused_disk.yml:10 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.266) 0:00:19.718 ********* ok: [sut] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/get_unused_disk.yml:15 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.023) 0:00:19.741 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/get_unused_disk.yml:20 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.020) 0:00:19.761 ********* ok: [sut] => { "unused_disks": [ "sda" ] } TASK [Create a partition device mounted on /opt/test1] ************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/tests_create_partition_volume_then_remove.yml:26 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.022) 0:00:19.783 ********* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.039) 0:00:19.823 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.029) 0:00:19.853 ********* ok: [sut] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.345) 0:00:20.199 ********* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", 
"kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.048) 0:00:20.247 ********* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.018) 0:00:20.265 ********* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.018) 0:00:20.284 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task 
path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.043) 0:00:20.328 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:9 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.020) 0:00:20.349 ********* ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "sda", "type": "partition", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.028) 0:00:20.377 ********* ok: [sut] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.020) 0:00:20.398 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:32 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.020) 0:00:20.418 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:39 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.019) 0:00:20.438 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:46 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.018) 0:00:20.456 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:53 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.018) 0:00:20.474 ********* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:67 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.034) 0:00:20.508 ********* TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:73 Wednesday 31 May 2023 18:10:00 +0000 (0:00:00.017) 0:00:20.525 ********* changed: [sut] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", 
"/dev/zram0", "/dev/sda1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "state": "mounted" } ], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:87 Wednesday 31 May 
2023 18:10:04 +0000 (0:00:03.612) 0:00:24.138 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 31 May 2023 18:10:04 +0000 (0:00:00.020) 0:00:24.159 ********* TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:105 Wednesday 31 May 2023 18:10:04 +0000 (0:00:00.016) 0:00:24.175 ********* ok: [sut] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/sda1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "state": "mounted" } ], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", 
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Wednesday 31 May 2023 18:10:04 +0000 (0:00:00.022) 0:00:24.198 ********* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 31 May 2023 18:10:04 +0000 (0:00:00.023) 0:00:24.222 ********* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:130 Wednesday 31 May 2023 18:10:04 +0000 (0:00:00.022) 0:00:24.245 ********* TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:142 Wednesday 31 May 2023 18:10:04 +0000 (0:00:00.025) 0:00:24.270 ********* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:147 Wednesday 31 May 2023 
18:10:05 +0000 (0:00:00.759) 0:00:25.030 *********
changed: [sut] => (item={'src': 'UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0', 'path': '/opt/test1', 'fstype': 'ext4', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0" }

TASK [linux-system-roles.storage : Manage mount ownership/permissions] *********
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:159
Wednesday 31 May 2023 18:10:05 +0000 (0:00:00.309) 0:00:25.340 *********
skipping: [sut] => (item={'src': 'UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0', 'path': '/opt/test1', 'fstype': 'ext4', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "state": "mounted" }, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:174
Wednesday 31 May 2023 18:10:05 +0000 (0:00:00.026) 0:00:25.366 *********
ok: [sut] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:182
Wednesday 31 May 2023 18:10:06 +0000 (0:00:00.597) 0:00:25.964 *********
ok: [sut] => { "changed": false, "stat": { "atime": 1685556054.43803, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1684244424.757, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131081, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1684244183.529, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3816983141", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:187
Wednesday 31 May 2023 18:10:06 +0000 (0:00:00.206) 0:00:26.171 *********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:209
Wednesday 31 May 2023 18:10:06 +0000 (0:00:00.017) 0:00:26.189 *********
ok: [sut]

TASK [Verify role results] *****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/tests_create_partition_volume_then_remove.yml:40
Wednesday 31 May 2023 18:10:07 +0000 (0:00:00.616) 0:00:26.806 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml for sut

TASK [Print out pool information] **********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:2
Wednesday 31 May 2023 18:10:07 +0000 (0:00:00.032) 0:00:26.838 *********
ok: [sut] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:7
Wednesday 31 May 2023 18:10:07 +0000 (0:00:00.024) 0:00:26.863 *********
skipping: [sut] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:15
Wednesday 31 May 2023 18:10:07 +0000 (0:00:00.018) 0:00:26.881 *********
ok: [sut] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "ext4", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "a1a80642-183f-4413-8c90-88d0e56cb0c0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "f91a7ec7-5021-4d03-b280-c7f5e8053b5f" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:20
Wednesday 31 May 2023 18:10:07 +0000 (0:00:00.282) 0:00:27.164 *********
ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003590", "end": "2023-05-31 18:10:07.728015", "rc": 0, "start": "2023-05-31 18:10:07.724425" }

STDOUT:

#
# /etc/fstab
# Created by anaconda on Tue May 16 13:36:23 2023
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=f91a7ec7-5021-4d03-b280-c7f5e8053b5f / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0 /opt/test1 ext4 defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:25
Wednesday 31 May 2023 18:10:07 +0000 (0:00:00.283) 0:00:27.447 *********
ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003539", "end": "2023-05-31 18:10:07.933478", "failed_when_result": false, "rc": 0, "start": "2023-05-31 18:10:07.929939" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:34
Wednesday 31 May 2023 18:10:07 +0000 (0:00:00.204) 0:00:27.651 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool.yml for sut

TASK [Set _storage_pool_tests] *************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool.yml:5
Wednesday 31 May 2023 18:10:07 +0000 (0:00:00.038) 0:00:27.690 *********
ok: [sut] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [Verify pool subset] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool.yml:18
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.018) 0:00:27.708 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml for sut
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-volumes.yml for sut

TASK [Set test variables] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:2
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.040) 0:00:27.748 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get the canonical device path for each member device] ********************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:13
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.020) 0:00:27.769 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set pvs lvm length] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:22
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.041) 0:00:27.810 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set pool pvs] ************************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:27
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.018) 0:00:27.829 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify PV count] *********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:33
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.018) 0:00:27.847 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected pv type] ****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:42
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.018) 0:00:27.865 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected pv type] ****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:48
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.017) 0:00:27.883 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected pv type] ****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:54
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.018) 0:00:27.901 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:59
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.020) 0:00:27.922 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check MD RAID] ***********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:73
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.020) 0:00:27.942 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml for sut

TASK [Get information about RAID] **********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:8
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.035) 0:00:27.978 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:14
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.018) 0:00:27.996 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:21
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.018) 0:00:28.015 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:28
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.018) 0:00:28.033 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:35
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.018) 0:00:28.051 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:45
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.018) 0:00:28.069 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:55
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.018) 0:00:28.088 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reset variables used by tests] *******************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:66
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.017) 0:00:28.106 *********
ok: [sut] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:76
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.019) 0:00:28.125 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-lvmraid.yml for sut

TASK [Validate pool member LVM RAID settings] **********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-lvmraid.yml:2
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.036) 0:00:28.162 *********
skipping: [sut] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'ext4', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': 0, 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } }

TASK [Check Thin Pools] ********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:79
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.026) 0:00:28.188 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-thin.yml for sut

TASK [Validate pool member thinpool settings] **********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-thin.yml:2
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.040) 0:00:28.229 *********
skipping: [sut] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'ext4', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': 0, 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } }

TASK [Check member encryption] *************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:82
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.025) 0:00:28.255 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml for sut

TASK [Set test variables] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml:5
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.045) 0:00:28.300 *********
ok: [sut] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml:13
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.024) 0:00:28.324 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml:20
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.020) 0:00:28.344 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml:27
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.020) 0:00:28.365 *********
ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:85
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.018) 0:00:28.383 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-vdo.yml for sut

TASK [Validate pool member VDO settings] ***************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-vdo.yml:2
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.040) 0:00:28.424 *********
skipping: [sut] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'ext4', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': 0, 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } }

TASK [Clean up test variables] *************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:88
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.024) 0:00:28.449 *********
ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Verify the volumes] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-volumes.yml:3
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.017) 0:00:28.467 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume.yml for sut

TASK [Set storage volume test variables] ***************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume.yml:2
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.035) 0:00:28.502 *********
ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for {{ storage_test_volume_subset }}] ********************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume.yml:21
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.022) 0:00:28.525 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml for sut
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml for sut
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fs.yml for sut
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml for sut
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml for sut
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml for sut
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml for sut
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml for sut

TASK [Get expected mount device based on device type] **************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:7
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.091) 0:00:28.616 *********
ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:16
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.021) 0:00:28.638 *********
ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2417574, "block_size": 4096, "block_total": 2552645, "block_used": 135071, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime,stripe=2048", "size_available": 9902383104, "size_total": 10455633920, "uuid": "a1a80642-183f-4413-8c90-88d0e56cb0c0" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2417574, "block_size": 4096, "block_total": 2552645, "block_used": 135071, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime,stripe=2048", "size_available": 9902383104, "size_total": 10455633920, "uuid": "a1a80642-183f-4413-8c90-88d0e56cb0c0" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:33
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.026) 0:00:28.664 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:46
Wednesday 31 May 2023 18:10:08 +0000 (0:00:00.018) 0:00:28.683 *********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:58
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.022) 0:00:28.705 *********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:66
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.021) 0:00:28.727 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:78
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.020) 0:00:28.748 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:90
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.018) 0:00:28.766 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the mount fs type] ************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:105
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.018) 0:00:28.785 *********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Get path of test volume device] ******************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:117
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.023) 0:00:28.808 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:123
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.018) 0:00:28.827 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:129
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.018) 0:00:28.845 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:141
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.018) 0:00:28.864 *********
ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:2
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.018) 0:00:28.882 *********
ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:40
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.064) 0:00:28.946 *********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:48
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.023) 0:00:28.970 *********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:58
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.021) 0:00:28.991 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:71
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.017) 0:00:29.009 *********
ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fs.yml:3
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.017) 0:00:29.026 *********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fs.yml:10
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.022) 0:00:29.048 *********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:3
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.026) 0:00:29.075 *********
ok: [sut] => { "changed": false, "stat": { "atime": 1685556605.6349344, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1685556604.3559103, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 721, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1685556604.3559103, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:9
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.208) 0:00:29.283 *********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node] **************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:16
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.023) 0:00:29.307 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:24
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.019) 0:00:29.327 *********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:30
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.020) 0:00:29.348 *********
ok: [sut] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:34
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.019) 0:00:29.367 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:39
Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.017) 0:00:29.385 *********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:3
Wednesday 31 May 2023 18:10:09 +0000 
(0:00:00.020) 0:00:29.406 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:10 Wednesday 31 May 2023 18:10:09 +0000 (0:00:00.018) 0:00:29.424 ********* changed: [sut] => { "changed": true, "rc": 0, "results": [ "Installed: cryptsetup-2.4.3-2.fc36.x86_64" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:15 Wednesday 31 May 2023 18:10:12 +0000 (0:00:03.227) 0:00:32.652 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:21 Wednesday 31 May 2023 18:10:12 +0000 (0:00:00.021) 0:00:32.673 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:30 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:32.691 ********* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:43 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.025) 0:00:32.717 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:49 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:32.735 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:54 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.017) 0:00:32.753 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:67 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:32.772 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:79 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:32.791 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:92 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:32.809 ********* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:104 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.023) 0:00:32.832 ********* ok: [sut] => { 
"changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:112 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.021) 0:00:32.854 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:120 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:32.873 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:129 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.020) 0:00:32.893 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:138 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:32.911 ********* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:8 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:32.930 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:14 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:32.948 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:21 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:32.967 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:28 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.019) 0:00:32.987 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:35 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.021) 0:00:33.008 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:44 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.023) 0:00:33.032 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:53 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.022) 0:00:33.054 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task 
path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:3 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.025) 0:00:33.080 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:11 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.025) 0:00:33.105 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:20 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.022) 0:00:33.127 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:28 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.023) 0:00:33.150 ********* ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:32 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.019) 0:00:33.170 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:46 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.019) 0:00:33.190 ********* skipping: [sut] => {} TASK [Show test blockinfo] ***************************************************** task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:50 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:33.209 ********* skipping: [sut] => {} TASK [Show test pool size] ***************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:54 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:33.228 ********* skipping: [sut] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:58 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.019) 0:00:33.247 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:68 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.023) 0:00:33.271 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:72 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.026) 0:00:33.297 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:77 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.020) 0:00:33.317 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:83 Wednesday 31 May 2023 
18:10:13 +0000 (0:00:00.018) 0:00:33.336 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:88 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:33.354 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:96 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:33.373 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:104 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.019) 0:00:33.393 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:109 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:33.411 ********* skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:113 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:33.430 ********* skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:117 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:33.448 ********* skipping: [sut] => {} TASK [Establish base value for 
expected thin pool size] ************************ task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:121 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.017) 0:00:33.466 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:129 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:33.484 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:138 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.019) 0:00:33.504 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:142 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.017) 0:00:33.522 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:150 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:33.540 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:156 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.018) 0:00:33.558 ********* ok: [sut] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional 
result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:160 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.020) 0:00:33.579 ********* ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:164 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.019) 0:00:33.599 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:5 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.026) 0:00:33.626 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:13 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.025) 0:00:33.651 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:18 Wednesday 31 May 2023 18:10:13 +0000 (0:00:00.020) 0:00:33.671 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:27 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.023) 0:00:33.695 
********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:35 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.020) 0:00:33.715 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:41 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.019) 0:00:33.735 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:47 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.021) 0:00:33.757 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume.yml:27 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.018) 0:00:33.775 ********* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:44 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.017) 0:00:33.793 ********* TASK [Clean up variable namespace] ********************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:54 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.015) 0:00:33.809 ********* ok: [sut] => { "ansible_facts": { "storage_test_blkinfo": null, 
"storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Repeat the previous invocation minus fs_type to verify idempotence] ****** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/tests_create_partition_volume_then_remove.yml:43 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.016) 0:00:33.826 ********* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.063) 0:00:33.890 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.027) 0:00:33.917 ********* ok: [sut] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.339) 0:00:34.257 ********* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, 
"item": "Fedora.yml" } skipping: [sut] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.048) 0:00:34.306 ********* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.017) 0:00:34.323 ********* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.018) 0:00:34.341 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.038) 0:00:34.380 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:9 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.018) 0:00:34.399 ********* ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "sda", "type": "partition", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.024) 0:00:34.423 ********* ok: [sut] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.019) 0:00:34.443 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:32 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.018) 0:00:34.462 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:39 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.018) 0:00:34.480 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Get service facts] ************************** task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:46 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.018) 0:00:34.499 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:53 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.019) 0:00:34.518 ********* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:67 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.032) 0:00:34.551 ********* TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:73 Wednesday 31 May 2023 18:10:14 +0000 (0:00:00.015) 0:00:34.566 ********* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "state": "mounted" } ], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10729029632, "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:87 Wednesday 31 May 2023 18:10:16 +0000 (0:00:01.329) 0:00:35.896 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 31 May 2023 18:10:16 +0000 (0:00:00.018) 0:00:35.915 ********* TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:105 Wednesday 31 May 2023 18:10:16 +0000 (0:00:00.015) 0:00:35.931 ********* ok: [sut] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "state": "mounted" } ], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "raid_stripe_size": null, "size": 10729029632, "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Wednesday 31 May 2023 18:10:16 +0000 (0:00:00.022) 0:00:35.953 ********* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10729029632, "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": 
null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 31 May 2023 18:10:16 +0000 (0:00:00.022) 0:00:35.975 ********* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:130 Wednesday 31 May 2023 18:10:16 +0000 (0:00:00.020) 0:00:35.996 ********* TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:142 Wednesday 31 May 2023 18:10:16 +0000 (0:00:00.018) 0:00:36.014 ********* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:147 Wednesday 31 May 2023 18:10:16 +0000 (0:00:00.597) 0:00:36.612 ********* ok: [sut] => (item={'src': 'UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0', 'path': '/opt/test1', 'fstype': 'ext4', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": 
"UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0" } TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:159 Wednesday 31 May 2023 18:10:17 +0000 (0:00:00.215) 0:00:36.827 ********* skipping: [sut] => (item={'src': 'UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0', 'path': '/opt/test1', 'fstype': 'ext4', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:174 Wednesday 31 May 2023 18:10:17 +0000 (0:00:00.024) 0:00:36.852 ********* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:182 Wednesday 31 May 2023 18:10:17 +0000 (0:00:00.601) 0:00:37.454 ********* ok: [sut] => { "changed": false, "stat": { "atime": 1685556054.43803, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1684244424.757, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131081, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, 
"isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1684244183.529, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3816983141", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:187 Wednesday 31 May 2023 18:10:17 +0000 (0:00:00.204) 0:00:37.658 ********* TASK [linux-system-roles.storage : Update facts] ******************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:209 Wednesday 31 May 2023 18:10:17 +0000 (0:00:00.018) 0:00:37.676 ********* ok: [sut] TASK [Assert file system is preserved on existing partition volume] ************ task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/tests_create_partition_volume_then_remove.yml:56 Wednesday 31 May 2023 18:10:18 +0000 (0:00:00.601) 0:00:38.278 ********* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify role results] ***************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/tests_create_partition_volume_then_remove.yml:63 Wednesday 31 May 2023 18:10:18 +0000 (0:00:00.024) 0:00:38.302 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml for sut TASK [Print out pool information] ********************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:2 Wednesday 31 May 2023 18:10:18 +0000 (0:00:00.033) 0:00:38.335 ********* ok: [sut] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10729029632, "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:7 Wednesday 31 May 2023 18:10:18 +0000 (0:00:00.048) 0:00:38.383 ********* skipping: [sut] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:15 Wednesday 31 May 2023 18:10:18 +0000 (0:00:00.019) 0:00:38.402 ********* ok: [sut] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "ext4", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "a1a80642-183f-4413-8c90-88d0e56cb0c0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "f91a7ec7-5021-4d03-b280-c7f5e8053b5f" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:20 Wednesday 31 May 2023 18:10:18 +0000 (0:00:00.203) 0:00:38.606 ********* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.004590", "end": "2023-05-31 18:10:20.089008", "rc": 0, "start": "2023-05-31 18:10:19.084418" } STDOUT: # # /etc/fstab # Created by anaconda on Tue May 16 13:36:23 2023 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=f91a7ec7-5021-4d03-b280-c7f5e8053b5f / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0 /opt/test1 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:25 Wednesday 31 May 2023 18:10:20 +0000 (0:00:01.202) 0:00:39.809 ********* ok: [sut] => { "changed": false, "cmd": [ "cat", 
"/etc/crypttab" ], "delta": "0:00:00.003571", "end": "2023-05-31 18:10:20.291413", "failed_when_result": false, "rc": 0, "start": "2023-05-31 18:10:20.287842" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:34 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.200) 0:00:40.009 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool.yml for sut TASK [Set _storage_pool_tests] ************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool.yml:5 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.038) 0:00:40.048 ********* ok: [sut] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Verify pool subset] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool.yml:18 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.017) 0:00:40.066 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml for sut included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-volumes.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:2 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.039) 0:00:40.105 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:13 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.018) 0:00:40.123 ********* TASK [Set pvs lvm length] ****************************************************** task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:22 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.016) 0:00:40.140 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:27 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.018) 0:00:40.158 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:33 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.019) 0:00:40.178 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:42 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.018) 0:00:40.197 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:48 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.018) 0:00:40.215 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:54 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.018) 0:00:40.233 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] 
*********************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:59 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.018) 0:00:40.251 ********* TASK [Check MD RAID] *********************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:73 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.015) 0:00:40.267 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml for sut TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:8 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.035) 0:00:40.303 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:14 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.018) 0:00:40.321 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:21 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.018) 0:00:40.340 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:28 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.018) 0:00:40.358 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:35 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.019) 0:00:40.377 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:45 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.018) 0:00:40.396 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:55 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.018) 0:00:40.414 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:66 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.018) 0:00:40.433 ********* ok: [sut] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:76 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.017) 0:00:40.451 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-lvmraid.yml for sut TASK [Validate pool member LVM RAID settings] ********************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-lvmraid.yml:2 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.039) 0:00:40.490 ********* skipping: [sut] => (item={'encryption': False, 'encryption_cipher': None, 
'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'ext4', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': 10729029632, 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": 
null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10729029632, "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:79 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.026) 0:00:40.517 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-thin.yml for sut TASK [Validate pool member thinpool settings] ********************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-thin.yml:2 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.041) 0:00:40.559 ********* skipping: [sut] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'ext4', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': 10729029632, 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0', '_kernel_device': '/dev/sda1', 
'_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10729029632, "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:82 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.026) 0:00:40.586 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml:5 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.044) 0:00:40.631 ********* ok: [sut] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", 
"_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml:13 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.022) 0:00:40.653 ********* TASK [Validate pool member crypttab entries] *********************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml:20 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.018) 0:00:40.672 ********* TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml:27 Wednesday 31 May 2023 18:10:20 +0000 (0:00:00.016) 0:00:40.689 ********* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:85 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.020) 0:00:40.709 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-vdo.yml for sut TASK [Validate pool member VDO settings] *************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-vdo.yml:2 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.065) 0:00:40.775 ********* skipping: [sut] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'ext4', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': 10729029632, 
'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10729029632, "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Clean up test variables] 
************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:88 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.027) 0:00:40.802 ********* ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-volumes.yml:3 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.019) 0:00:40.821 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume.yml for sut TASK [Set storage volume test variables] *************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume.yml:2 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.036) 0:00:40.858 ********* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume.yml:21 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.024) 0:00:40.882 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml for sut included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml for sut included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fs.yml for sut included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml for sut included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml for sut included: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml for sut included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml for sut included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml for sut TASK [Get expected mount device based on device type] ************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:7 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.088) 0:00:40.971 ********* ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:16 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.024) 0:00:40.995 ********* ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2417574, "block_size": 4096, "block_total": 2552645, "block_used": 135071, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime,stripe=2048", "size_available": 9902383104, "size_total": 10455633920, "uuid": "a1a80642-183f-4413-8c90-88d0e56cb0c0" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2417574, "block_size": 4096, "block_total": 2552645, "block_used": 135071, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime,stripe=2048", "size_available": 9902383104, "size_total": 10455633920, "uuid": "a1a80642-183f-4413-8c90-88d0e56cb0c0" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:33 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.027) 0:00:41.023 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:46 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.019) 0:00:41.042 ********* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:58 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.022) 0:00:41.065 ********* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:66 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.022) 0:00:41.087 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:78 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.018) 0:00:41.106 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:90 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.019) 0:00:41.126 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the mount fs type] ************************************************ task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:105 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.019) 0:00:41.145 ********* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get path of test volume device] ****************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:117 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.024) 0:00:41.170 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:123 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.018) 0:00:41.189 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:129 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.017) 0:00:41.207 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:141 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.018) 0:00:41.225 ********* ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:2 
Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.019) 0:00:41.244 ********* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:40 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.037) 0:00:41.281 ********* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:48 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.022) 0:00:41.303 ********* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:58 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.021) 0:00:41.324 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:71 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.017) 0:00:41.342 ********* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, 
"storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fs.yml:3 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.017) 0:00:41.360 ********* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fs.yml:10 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.024) 0:00:41.384 ********* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:3 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.025) 0:00:41.410 ********* ok: [sut] => { "changed": false, "stat": { "atime": 1685556615.4151185, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1685556615.4081182, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 721, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1685556615.4081182, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:9 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.210) 
0:00:41.620 ********* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:16 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.025) 0:00:41.645 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:24 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.019) 0:00:41.665 ********* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:30 Wednesday 31 May 2023 18:10:21 +0000 (0:00:00.022) 0:00:41.687 ********* ok: [sut] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:34 Wednesday 31 May 2023 18:10:22 +0000 (0:00:00.019) 0:00:41.707 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:39 Wednesday 31 May 2023 18:10:22 +0000 (0:00:00.018) 0:00:41.726 ********* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:3 Wednesday 31 May 2023 18:10:22 +0000 (0:00:00.022) 0:00:41.749 ********* skipping: [sut] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:10 Wednesday 31 May 2023 18:10:22 +0000 (0:00:00.020) 0:00:41.769 ********* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:15 Wednesday 31 May 2023 18:10:24 +0000 (0:00:02.363) 0:00:44.132 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:21 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.018) 0:00:44.151 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:30 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.018) 0:00:44.170 ********* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:43 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.025) 0:00:44.195 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:49 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.018) 0:00:44.214 ********* 
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:54 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.019) 0:00:44.233 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:67 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.018) 0:00:44.252 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:79 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.018) 0:00:44.270 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:92 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.017) 0:00:44.288 ********* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:104 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.024) 0:00:44.312 ********* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:112 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.022) 0:00:44.335 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:120 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.020) 0:00:44.355 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:129 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.018) 0:00:44.374 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:138 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.019) 0:00:44.393 ********* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:8 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.019) 0:00:44.413 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:14 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.019) 0:00:44.433 ********* skipping: [sut] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:21 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.018) 0:00:44.451 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:28 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.020) 0:00:44.472 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:35 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.018) 0:00:44.491 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:44 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.018) 0:00:44.510 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:53 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.018) 0:00:44.529 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:3 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.018) 0:00:44.547 ********* skipping: [sut] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:11 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.019) 0:00:44.567 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:20 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.021) 0:00:44.588 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:28 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.019) 0:00:44.608 ********* ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:32 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.020) 0:00:44.628 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:46 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.020) 0:00:44.648 ********* skipping: [sut] => {} TASK [Show test blockinfo] ***************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:50 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.019) 0:00:44.667 ********* skipping: [sut] => {} TASK [Show test 
pool size] ***************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:54 Wednesday 31 May 2023 18:10:24 +0000 (0:00:00.019) 0:00:44.687 ********* skipping: [sut] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:58 Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.021) 0:00:44.708 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:68 Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.019) 0:00:44.728 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:72 Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.018) 0:00:44.746 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:77 Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.019) 0:00:44.766 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:83 Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.019) 0:00:44.785 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] 
******************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:88 Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.018) 0:00:44.804 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:96 Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.020) 0:00:44.824 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:104 Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.018) 0:00:44.843 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:109 Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.018) 0:00:44.862 ********* skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:113 Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.018) 0:00:44.881 ********* skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:117 Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.018) 0:00:44.899 ********* skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:121 Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.018) 
0:00:44.917 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:129 Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.019) 0:00:44.937 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:138 Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.018) 0:00:44.955 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:142 Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.018) 0:00:44.973 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:150 Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.019) 0:00:44.993 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:156 Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.019) 0:00:45.013 ********* ok: [sut] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:160
Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.021) 0:00:45.034 *********
ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [Assert expected size is actual size] *************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:164
Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.021) 0:00:45.056 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:5
Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.021) 0:00:45.077 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV segment type] *****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:13
Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.020) 0:00:45.097 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check segment type] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:18
Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.018) 0:00:45.116 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV cache size] *******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:27
Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.018) 0:00:45.134 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:35
Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.018) 0:00:45.153 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:41
Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.020) 0:00:45.173 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:47
Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.019) 0:00:45.193 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume.yml:27
Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.019) 0:00:45.213 *********
ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:44
Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.018) 0:00:45.231 *********

TASK [Clean up variable namespace] *********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:54
Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.015) 0:00:45.247 *********
ok: [sut] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Remove the partition created above] **************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/tests_create_partition_volume_then_remove.yml:66
Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.018) 0:00:45.265 *********

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.046) 0:00:45.311 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.027) 0:00:45.339 *********
ok: [sut]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 31 May 2023 18:10:25 +0000 (0:00:00.341) 0:00:45.680 *********
skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" }
skipping: [sut] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" }
skipping: [sut] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 31 May 2023 18:10:26 +0000 (0:00:00.072) 0:00:45.753 *********
ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 31 May 2023 18:10:26 +0000 (0:00:00.019) 0:00:45.772 *********
ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : Include the appropriate provider tasks] *****
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 31 May 2023 18:10:26 +0000 (0:00:00.016) 0:00:45.789 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut

TASK [linux-system-roles.storage : Make sure blivet is available] **************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 31 May 2023 18:10:26 +0000 (0:00:00.037) 0:00:45.826 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Show storage_pools] *************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:9
Wednesday 31 May 2023 18:10:26 +0000 (0:00:00.020) 0:00:45.846 *********
ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "sda", "state": "absent", "type": "partition", "volumes": [ { "mount_point": "/opt/test1", "name": "sda1", "state": "absent", "type": "partition" } ] } ] }

TASK [linux-system-roles.storage : Show storage_volumes] ***********************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 31 May 2023 18:10:26 +0000 (0:00:00.024) 0:00:45.871 *********
ok: [sut] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : Get required packages] **********************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 31 May 2023 18:10:26 +0000 (0:00:00.019) 0:00:45.891 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Enable copr repositories if needed] *********
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:32
Wednesday 31 May 2023 18:10:26 +0000 (0:00:00.018) 0:00:45.910 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Make sure required packages are installed] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:39
Wednesday 31 May 2023 18:10:26 +0000 (0:00:00.018) 0:00:45.929 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Get service facts] **************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:46
Wednesday 31 May 2023 18:10:26 +0000 (0:00:00.018) 0:00:45.947 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:53
Wednesday 31 May 2023 18:10:26 +0000 (0:00:00.019) 0:00:45.967 *********
ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:67
Wednesday 31 May 2023 18:10:26 +0000 (0:00:00.033) 0:00:46.000 *********

TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:73
Wednesday 31 May 2023 18:10:26 +0000 (0:00:00.016) 0:00:46.016 *********
changed: [sut] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10729029632, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:87
Wednesday 31 May 2023 18:10:28 +0000 (0:00:01.674) 0:00:47.691 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 31 May 2023 18:10:28 +0000 (0:00:00.019) 0:00:47.711 *********

TASK [linux-system-roles.storage : Show blivet_output] *************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:105
Wednesday 31 May 2023 18:10:28 +0000 (0:00:00.015) 0:00:47.726 *********
ok: [sut] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10729029632, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : Set the list of pools for test verification] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110
Wednesday 31 May 2023 18:10:28 +0000 (0:00:00.022) 0:00:47.748 *********
ok: [sut] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10729029632, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : Set the list of volumes for test verification] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 31 May 2023 18:10:28 +0000 (0:00:00.023) 0:00:47.772 *********
ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : Remove obsolete mounts] *********************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:130
Wednesday 31 May 2023 18:10:28 +0000 (0:00:00.019) 0:00:47.792 *********
changed: [sut] => (item={'src': 'UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0" }

TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:142
Wednesday 31 May 2023 18:10:28 +0000 (0:00:00.219) 0:00:48.011 *********
ok: [sut] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : Set up new/current mounts] ******************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:147
Wednesday 31 May 2023 18:10:28 +0000 (0:00:00.603) 0:00:48.615 *********

TASK [linux-system-roles.storage : Manage mount ownership/permissions] *********
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:159
Wednesday 31 May 2023 18:10:28 +0000 (0:00:00.020) 0:00:48.635 *********

TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:174
Wednesday 31 May 2023 18:10:28 +0000 (0:00:00.018) 0:00:48.654 *********
ok: [sut] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:182
Wednesday 31 May 2023 18:10:29 +0000 (0:00:00.593) 0:00:49.248 *********
ok: [sut] => { "changed": false, "stat": { "atime": 1685556054.43803, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1684244424.757, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131081, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1684244183.529, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3816983141", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:187
Wednesday 31 May 2023 18:10:29 +0000 (0:00:00.203) 0:00:49.451 *********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:209
Wednesday 31 May 2023 18:10:29 +0000 (0:00:00.016) 0:00:49.468 *********
ok: [sut]

TASK [Verify role results] *****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/tests_create_partition_volume_then_remove.yml:81
Wednesday 31 May 2023 18:10:30 +0000 (0:00:00.596) 0:00:50.065 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml for sut

TASK [Print out pool information] **********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:2
Wednesday 31 May 2023 18:10:30 +0000 (0:00:00.034) 0:00:50.099 *********
ok: [sut] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10729029632, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:7
Wednesday 31 May 2023 18:10:30 +0000 (0:00:00.022) 0:00:50.122 *********
skipping: [sut] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:15
Wednesday 31 May 2023 18:10:30 +0000 (0:00:00.018) 0:00:50.141 *********
ok: [sut] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "f91a7ec7-5021-4d03-b280-c7f5e8053b5f" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:20
Wednesday 31 May 2023 18:10:30 +0000 (0:00:00.203) 0:00:50.345 *********
ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003436", "end": "2023-05-31 18:10:30.829882", "rc": 0, "start": "2023-05-31 18:10:30.826446" }

STDOUT:

#
# /etc/fstab
# Created by anaconda on Tue May 16 13:36:23 2023
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=f91a7ec7-5021-4d03-b280-c7f5e8053b5f / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:25
Wednesday 31 May 2023 18:10:30 +0000 (0:00:00.202) 0:00:50.547 *********
ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003634", "end": "2023-05-31 18:10:31.034311", "failed_when_result": false, "rc": 0, "start": "2023-05-31 18:10:31.030677" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:34
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.205) 0:00:50.753 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool.yml for sut

TASK [Set _storage_pool_tests] *************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool.yml:5
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.040) 0:00:50.793 *********
ok: [sut] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [Verify pool subset] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool.yml:18
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.017) 0:00:50.810 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml for sut
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-volumes.yml for sut

TASK [Set test variables] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:2
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.038) 0:00:50.848 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get the canonical device path for each member device] ********************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:13
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.018) 0:00:50.867 *********

TASK [Set pvs lvm length] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:22
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.038) 0:00:50.906 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set pool pvs] ************************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:27
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.019) 0:00:50.925 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify PV count] *********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:33
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.018) 0:00:50.943 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected pv type] ****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:42
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.017) 0:00:50.961 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected pv type] ****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:48
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.017) 0:00:50.979 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected pv type] ****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:54
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.019) 0:00:50.999 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:59
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.019) 0:00:51.018 *********

TASK [Check MD RAID] ***********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:73
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.015) 0:00:51.033 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml for sut

TASK [Get information about RAID] **********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:8
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.033) 0:00:51.066 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:14
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.017) 0:00:51.084 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:21
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.018) 0:00:51.103 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:28
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.018) 0:00:51.121 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:35
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.018) 0:00:51.139 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:45
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.018) 0:00:51.158 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:55
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.018) 0:00:51.176 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reset variables used by tests] *******************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:66
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.018) 0:00:51.195 *********
ok: [sut] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:76
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.019) 0:00:51.214 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-lvmraid.yml for sut

TASK [Validate pool member LVM RAID settings] **********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-lvmraid.yml:2
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.039) 0:00:51.254 *********
skipping: [sut] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'ext4', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'sda1', 'raid_level': None, 'size': 10729029632, 'state': 'absent', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10729029632, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } }

TASK [Check Thin Pools] ********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:79
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.024) 0:00:51.278 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-thin.yml for sut

TASK [Validate pool member thinpool settings] **********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-thin.yml:2
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.036) 0:00:51.314 *********
skipping: [sut] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'ext4', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'sda1', 'raid_level': None, 'size': 10729029632, 'state': 'absent', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10729029632, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } }

TASK [Check member encryption] *************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:82
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.024) 0:00:51.339 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml for sut

TASK [Set test variables] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml:5
Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.040) 0:00:51.379 *********
ok: [sut] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml:13
Wednesday 31 May 2023 18:10:31
+0000 (0:00:00.021) 0:00:51.400 ********* TASK [Validate pool member crypttab entries] *********************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml:20 Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.016) 0:00:51.417 ********* TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml:27 Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.016) 0:00:51.433 ********* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:85 Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.018) 0:00:51.452 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-vdo.yml for sut TASK [Validate pool member VDO settings] *************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-vdo.yml:2 Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.041) 0:00:51.493 ********* skipping: [sut] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'ext4', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'sda1', 'raid_level': None, 'size': 10729029632, 'state': 'absent', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': 
None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_mount_id": "UUID=a1a80642-183f-4413-8c90-88d0e56cb0c0", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10729029632, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Clean up test variables] ************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:88 Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.027) 0:00:51.520 ********* ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } 
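For readers tracing the skips above: the LVM-RAID, thin-pool, and VDO member checks all loop over the same volume definition and skip because it is a plain partition volume (not LVM) marked `state: absent`. A minimal sketch of the role input that would produce this loop item, reconstructed from the fields shown in the log (the pool-level layout is an assumption; only the volume fields `name`, `type`, `disks`, `fs_type`, `mount_point`, and `state` are confirmed by the output above):

```yaml
# Hypothetical reconstruction of the test's storage_pools variable;
# field values taken from the loop item printed in the log.
storage_pools:
  - name: sda          # assumed: pool name not shown in this log chunk
    type: partition
    disks: ["sda"]
    volumes:
      - name: sda1
        type: partition        # => LVM RAID / thin / VDO checks skip
        fs_type: ext4
        mount_point: /opt/test1
        state: absent          # volume is being removed in this phase
```

Because `type` is `partition` and `state` is `absent`, every conditional guarding the LVM-specific validations evaluates false, which is why each item is reported with `skip_reason: Conditional result was False`.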
TASK [Verify the volumes] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-volumes.yml:3 Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.018) 0:00:51.539 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume.yml for sut TASK [Set storage volume test variables] *************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume.yml:2 Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.036) 0:00:51.575 ********* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume.yml:21 Wednesday 31 May 2023 18:10:31 +0000 (0:00:00.022) 0:00:51.598 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml for sut included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml for sut included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fs.yml for sut included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml for sut included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml for sut included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml for sut included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml for sut included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml for sut TASK [Get expected mount device based on device type] ************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:7 
Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.097) 0:00:51.695 ********* ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:16 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.023) 0:00:51.719 ********* ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:33 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.025) 0:00:51.745 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:46 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.020) 0:00:51.765 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:58 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.018) 0:00:51.783 ********* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:66 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.022) 0:00:51.806 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory 
group] ******************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:78 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.018) 0:00:51.825 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:90 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.018) 0:00:51.843 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the mount fs type] ************************************************ task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:105 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.018) 0:00:51.862 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:117 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.018) 0:00:51.880 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:123 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.017) 0:00:51.898 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:129 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.019) 0:00:51.918 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } 
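The mount, ownership, and swap checks above all skip for the same reason: the test variables set earlier record zero expected matches (`storage_test_mount_expected_match_count: "0"`) because the volume is absent. A sketch of the guard pattern these test files appear to use (the exact task bodies are not shown in this log, so the `when:`/`assert:` wording here is an assumption illustrating the mechanism, not the verbatim test source):

```yaml
# Hypothetical shape of a guarded verification task, matching the
# skip behavior seen in the log: checks only run when the volume
# is expected to be present and mounted.
- name: Verify the current mount state by device
  assert:
    that:
      - storage_test_mount_device_matches | length ==
        storage_test_mount_expected_match_count | int
  when: _storage_test_volume_present and storage_test_volume.mount_point
```

With `_storage_test_volume_present: false` (set in the "Set storage volume test variables" task), the `when:` clause short-circuits and Ansible reports `Conditional result was False` instead of running the assertion.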
TASK [Unset facts] ************************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:141 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.018) 0:00:51.936 ********* ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:2 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.018) 0:00:51.954 ********* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:40 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.037) 0:00:51.991 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:48 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.018) 0:00:52.010 ********* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:58 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.023) 0:00:52.034 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:71 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.043) 0:00:52.077 ********* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fs.yml:3 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.019) 0:00:52.097 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fs.yml:10 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.018) 0:00:52.115 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:3 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.018) 0:00:52.134 ********* ok: [sut] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:9 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.195) 0:00:52.329 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:16 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.019) 0:00:52.349 ********* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:24 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.023) 0:00:52.373 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:30 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.014) 0:00:52.387 ********* ok: [sut] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:34 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.019) 0:00:52.407 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:39 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.019) 0:00:52.426 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:3 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.014) 0:00:52.441 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:10 Wednesday 31 May 2023 18:10:32 +0000 (0:00:00.018) 0:00:52.459 ********* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:15 Wednesday 31 May 2023 18:10:35 +0000 (0:00:02.321) 0:00:54.780 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:21 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.020) 0:00:54.801 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:30 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.018) 0:00:54.820 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:43 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.015) 0:00:54.835 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if 
encrypted] *********************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:49 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.018) 0:00:54.853 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:54 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.019) 0:00:54.873 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:67 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.015) 0:00:54.888 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:79 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.015) 0:00:54.903 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:92 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.015) 0:00:54.919 ********* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:104 Wednesday 31 May 2023 18:10:35 +0000 
(0:00:00.026) 0:00:54.945 ********* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:112 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.023) 0:00:54.969 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:120 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.019) 0:00:54.989 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:129 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.018) 0:00:55.008 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:138 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.018) 0:00:55.027 ********* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:8 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.017) 0:00:55.045 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task 
path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:14 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.020) 0:00:55.065 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:21 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.020) 0:00:55.085 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:28 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.019) 0:00:55.104 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:35 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.017) 0:00:55.122 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:44 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.018) 0:00:55.140 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:53 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.017) 0:00:55.158 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* 
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:3 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.019) 0:00:55.177 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:11 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.017) 0:00:55.194 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:20 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.017) 0:00:55.212 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:28 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.017) 0:00:55.229 ********* ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:32 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.018) 0:00:55.248 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:46 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.017) 0:00:55.266 ********* skipping: [sut] => {} TASK [Show test blockinfo] ***************************************************** task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:50 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.020) 0:00:55.286 ********* skipping: [sut] => {} TASK [Show test pool size] ***************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:54 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.019) 0:00:55.306 ********* skipping: [sut] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:58 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.019) 0:00:55.325 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:68 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.018) 0:00:55.344 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:72 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.018) 0:00:55.363 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:77 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.018) 0:00:55.382 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:83 Wednesday 31 May 2023 
18:10:35 +0000 (0:00:00.019) 0:00:55.401 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:88 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.018) 0:00:55.420 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:96 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.018) 0:00:55.438 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:104 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.018) 0:00:55.456 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:109 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.019) 0:00:55.476 ********* skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:113 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.018) 0:00:55.494 ********* skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:117 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.020) 0:00:55.514 ********* skipping: [sut] => {} TASK [Establish base value for 
expected thin pool size] ************************ task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:121 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.018) 0:00:55.533 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:129 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.019) 0:00:55.553 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:138 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.019) 0:00:55.572 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:142 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.019) 0:00:55.592 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:150 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.021) 0:00:55.613 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:156 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.019) 0:00:55.633 ********* ok: [sut] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional 
result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:160 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.021) 0:00:55.654 ********* ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:164 Wednesday 31 May 2023 18:10:35 +0000 (0:00:00.020) 0:00:55.675 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:5 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.021) 0:00:55.696 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:13 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.020) 0:00:55.717 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:18 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.018) 0:00:55.736 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:27 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.020) 0:00:55.757 
********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:35 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.019) 0:00:55.776 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:41 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.019) 0:00:55.795 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:47 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.018) 0:00:55.814 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume.yml:27 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.019) 0:00:55.833 ********* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:44 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.016) 0:00:55.850 ********* TASK [Clean up variable namespace] ********************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:54 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.017) 0:00:55.867 ********* ok: [sut] => { "ansible_facts": { "storage_test_blkinfo": null, 
"storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/tests_create_partition_volume_then_remove.yml:84 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.017) 0:00:55.884 ********* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.044) 0:00:55.929 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.053) 0:00:55.983 ********* ok: [sut] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.341) 0:00:56.324 ********* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, 
"item": "Fedora.yml" } skipping: [sut] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.045) 0:00:56.369 ********* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.017) 0:00:56.387 ********* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.017) 0:00:56.404 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.040) 0:00:56.444 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:9 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.019) 0:00:56.464 ********* ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "sda", "state": "absent", "type": "partition", "volumes": [ { "mount_point": "/opt/test1", "name": "sda1", "state": "absent", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.022) 0:00:56.487 ********* ok: [sut] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.019) 0:00:56.506 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:32 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.017) 0:00:56.524 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:39 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.019) 0:00:56.544 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Get service facts] ************************** task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:46 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.018) 0:00:56.562 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:53 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.018) 0:00:56.580 ********* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:67 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.030) 0:00:56.611 ********* TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:73 Wednesday 31 May 2023 18:10:36 +0000 (0:00:00.015) 0:00:56.626 ********* changed: [sut] => { "actions": [], "changed": true, "crypts": [], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, 
"compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:87 Wednesday 31 May 2023 18:10:38 +0000 (0:00:01.224) 0:00:57.851 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 31 May 2023 18:10:38 +0000 (0:00:00.018) 0:00:57.870 ********* TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:105 Wednesday 31 May 2023 18:10:38 +0000 (0:00:00.016) 0:00:57.886 ********* ok: [sut] => { "blivet_output": { "actions": [], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", 
"/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Wednesday 31 May 2023 18:10:38 +0000 (0:00:00.022) 0:00:57.909 ********* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", 
"raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 31 May 2023 18:10:38 +0000 (0:00:00.021) 0:00:57.930 ********* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:130 Wednesday 31 May 2023 18:10:38 +0000 (0:00:00.021) 0:00:57.951 ********* TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:142 Wednesday 31 May 2023 18:10:38 +0000 (0:00:00.018) 0:00:57.970 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:147 Wednesday 31 May 2023 18:10:38 +0000 (0:00:00.019) 0:00:57.989 ********* TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:159 Wednesday 31 May 2023 18:10:38 +0000 (0:00:00.017) 0:00:58.007 ********* TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:174 Wednesday 31 May 2023 18:10:38 +0000 (0:00:00.018) 0:00:58.026 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:182 Wednesday 31 May 2023 18:10:38 +0000 (0:00:00.018) 0:00:58.045 ********* ok: [sut] => { "changed": false, "stat": { "atime": 1685556054.43803, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1684244424.757, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131081, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": 
"inode/x-empty", "mode": "0600", "mtime": 1684244183.529, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3816983141", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:187 Wednesday 31 May 2023 18:10:38 +0000 (0:00:00.205) 0:00:58.250 ********* TASK [linux-system-roles.storage : Update facts] ******************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:209 Wednesday 31 May 2023 18:10:38 +0000 (0:00:00.018) 0:00:58.269 ********* ok: [sut] TASK [Verify role results] ***************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/tests_create_partition_volume_then_remove.yml:99 Wednesday 31 May 2023 18:10:39 +0000 (0:00:00.594) 0:00:58.863 ********* included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml for sut TASK [Print out pool information] ********************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:2 Wednesday 31 May 2023 18:10:39 +0000 (0:00:00.038) 0:00:58.902 ********* ok: [sut] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", 
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:7 Wednesday 31 May 2023 18:10:39 +0000 (0:00:00.023) 0:00:58.925 ********* skipping: [sut] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:15 Wednesday 31 May 2023 18:10:39 +0000 (0:00:00.017) 0:00:58.943 ********* ok: [sut] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "f91a7ec7-5021-4d03-b280-c7f5e8053b5f" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:20 Wednesday 31 May 2023 18:10:39 +0000 (0:00:00.207) 0:00:59.150 ********* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" 
], "delta": "0:00:01.004466", "end": "2023-05-31 18:10:40.635913", "rc": 0, "start": "2023-05-31 18:10:39.631447" } STDOUT: # # /etc/fstab # Created by anaconda on Tue May 16 13:36:23 2023 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=f91a7ec7-5021-4d03-b280-c7f5e8053b5f / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:25 Wednesday 31 May 2023 18:10:40 +0000 (0:00:01.204) 0:01:00.355 ********* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003398", "end": "2023-05-31 18:10:40.840442", "failed_when_result": false, "rc": 0, "start": "2023-05-31 18:10:40.837044" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: 
/WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:34
Wednesday 31 May 2023 18:10:40 +0000 (0:00:00.203) 0:01:00.558 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool.yml for sut

TASK [Set _storage_pool_tests] *************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool.yml:5
Wednesday 31 May 2023 18:10:40 +0000 (0:00:00.039) 0:01:00.597 *********
ok: [sut] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [Verify pool subset] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool.yml:18
Wednesday 31 May 2023 18:10:40 +0000 (0:00:00.019) 0:01:00.616 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml for sut
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-volumes.yml for sut

TASK [Set test variables] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:2
Wednesday 31 May 2023 18:10:40 +0000 (0:00:00.064) 0:01:00.681 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get the canonical device path for each member device] ********************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:13
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.020) 0:01:00.701 *********

TASK [Set pvs lvm length] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:22
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.016) 0:01:00.717 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set pool pvs] ************************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:27
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.017) 0:01:00.735 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify PV count] *********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:33
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.019) 0:01:00.755 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected pv type] ****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:42
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.018) 0:01:00.774 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected pv type] ****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:48
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.018) 0:01:00.792 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected pv type] ****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:54
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.017) 0:01:00.810 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:59
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.018) 0:01:00.828 *********

TASK [Check MD RAID] ***********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:73
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.015) 0:01:00.844 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml for sut

TASK [Get information about RAID] **********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:8
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.037) 0:01:00.881 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:14
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.021) 0:01:00.903 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:21
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.018) 0:01:00.922 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:28
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.017) 0:01:00.939 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:35
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.018) 0:01:00.958 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:45
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.018) 0:01:00.976 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:55
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.018) 0:01:00.995 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reset variables used by tests] *******************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-md.yml:66
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.018) 0:01:01.014 *********
ok: [sut] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:76
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.018) 0:01:01.032 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-lvmraid.yml for sut

TASK [Validate pool member LVM RAID settings] **********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-lvmraid.yml:2
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.036) 0:01:01.069 *********
skipping: [sut] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'sda1', 'raid_level': None, 'size': 0, 'state': 'absent', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '', '_raw_device': '', '_mount_id': ''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } }

TASK [Check Thin Pools] ********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:79
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.025) 0:01:01.095 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-thin.yml for sut

TASK [Validate pool member thinpool settings] **********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-thin.yml:2
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.039) 0:01:01.134 *********
skipping: [sut] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'sda1', 'raid_level': None, 'size': 0, 'state': 'absent', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '', '_raw_device': '', '_mount_id': ''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } }

TASK [Check member encryption] *************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:82
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.026) 0:01:01.160 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml for sut

TASK [Set test variables] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml:5
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.044) 0:01:01.204 *********
ok: [sut] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml:13
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.023) 0:01:01.228 *********

TASK [Validate pool member crypttab entries] ***********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml:20
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.016) 0:01:01.245 *********

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-encryption.yml:27
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.017) 0:01:01.262 *********
ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:85
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.019) 0:01:01.282 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-vdo.yml for sut

TASK [Validate pool member VDO settings] ***************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-pool-members-vdo.yml:2
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.039) 0:01:01.322 *********
skipping: [sut] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'sda1', 'raid_level': None, 'size': 0, 'state': 'absent', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '', '_raw_device': '', '_mount_id': ''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } }

TASK [Clean up test variables] *************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-members.yml:88
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.024) 0:01:01.346 *********
ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Verify the volumes] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-pool-volumes.yml:3
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.017) 0:01:01.364 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume.yml for sut

TASK [Set storage volume test variables] ***************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume.yml:2
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.037) 0:01:01.402 *********
ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md",
"size", "cache" ] }, "changed": false }

TASK [Run test verify for {{ storage_test_volume_subset }}] ********************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume.yml:21
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.021) 0:01:01.424 *********
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml for sut
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml for sut
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fs.yml for sut
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml for sut
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml for sut
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml for sut
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml for sut
included: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml for sut

TASK [Get expected mount device based on device type] **************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:7
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.084) 0:01:01.508 *********
ok: [sut] => { "ansible_facts": { "storage_test_device_path": "" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:16
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.023) 0:01:01.531 *********
ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:33
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.026) 0:01:01.558 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:46
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.019) 0:01:01.577 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:58
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.018) 0:01:01.596 *********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:66
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.021) 0:01:01.617 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:78
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.018) 0:01:01.635 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:90
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.019) 0:01:01.655 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the mount fs type] ************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:105
Wednesday 31 May 2023 18:10:41 +0000 (0:00:00.018) 0:01:01.673 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:117
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.019) 0:01:01.693 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:123
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.019) 0:01:01.712 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:129
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.020) 0:01:01.733 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-mount.yml:141
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.019) 0:01:01.752 *********
ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:2
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.052) 0:01:01.805 *********
ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:40
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.038) 0:01:01.844 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:48
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.019) 0:01:01.863 *********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:58
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.022) 0:01:01.886 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fstab.yml:71
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.018) 0:01:01.904 *********
ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fs.yml:3
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.017) 0:01:01.922 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-fs.yml:10
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.019) 0:01:01.942 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:3
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.018) 0:01:01.961 *********
ok: [sut] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:9
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.196) 0:01:02.158 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the device node] **************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:16
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.020) 0:01:02.178 *********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:24
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.024) 0:01:02.203 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:30
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.016) 0:01:02.219 *********
ok: [sut] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:34
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.021) 0:01:02.241 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-device.yml:39
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.020) 0:01:02.261 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:3
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.017) 0:01:02.278 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure cryptsetup is present] ********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:10
Wednesday 31 May 2023 18:10:42 +0000 (0:00:00.020) 0:01:02.299 *********
ok: [sut] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:15
Wednesday 31 May 2023 18:10:44 +0000 (0:00:02.322) 0:01:04.621 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:21
Wednesday 31 May 2023 18:10:44 +0000 (0:00:00.020) 0:01:04.642 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:30
Wednesday 31 May 2023 18:10:44 +0000 (0:00:00.018) 0:01:04.660 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:43
Wednesday 31 May 2023 18:10:44 +0000 (0:00:00.015) 0:01:04.676 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:49
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.019) 0:01:04.695 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:54
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.020) 0:01:04.715 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:67
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.015) 0:01:04.730 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:79
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.014) 0:01:04.745 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:92
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.014) 0:01:04.760 *********
ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:104
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.023) 0:01:04.784 *********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:112
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.021) 0:01:04.806 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:120
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.019) 0:01:04.826 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:129
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.019) 0:01:04.846 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:138
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.018) 0:01:04.864 *********
ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:8
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.018) 0:01:04.882 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:14
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.018) 0:01:04.900 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:21
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.018) 0:01:04.919 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:28
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.019) 0:01:04.939 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:35
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.018) 0:01:04.957 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:44
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.018) 0:01:04.976 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-md.yml:53
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.019) 0:01:04.995 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:3
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.018) 0:01:05.014 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested size of the volume] **********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:11
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.017) 0:01:05.031 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:20
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.019) 0:01:05.050 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show expected size] ******************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:28
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.017) 0:01:05.068 *********
ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [Get the size of parent/pool device] **************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:32
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.018) 0:01:05.086 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show test pool] **********************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:46
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.021) 0:01:05.108 *********
skipping: [sut] => {}

TASK [Show test blockinfo] *****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:50
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.018) 0:01:05.126 *********
skipping: [sut] => {}

TASK [Show test pool size] *****************************************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:54
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.017) 0:01:05.143 *********
skipping: [sut] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:58
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.018) 0:01:05.162 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:68
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.018) 0:01:05.180 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:72
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.019) 0:01:05.200 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:77
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.019) 0:01:05.220 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:83
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.017) 0:01:05.238 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:88
Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.017) 0:01:05.255 *********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:96
Wednesday 31 May 2023 18:10:45
+0000 (0:00:00.019) 0:01:05.274 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:104 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.017) 0:01:05.292 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:109 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.018) 0:01:05.311 ********* skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:113 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.020) 0:01:05.331 ********* skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:117 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.018) 0:01:05.350 ********* skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:121 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.019) 0:01:05.369 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:129 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.020) 0:01:05.389 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected 
thin pool volume size] ***************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:138 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.018) 0:01:05.408 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:142 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.018) 0:01:05.426 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:150 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.018) 0:01:05.445 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:156 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.018) 0:01:05.463 ********* ok: [sut] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:160 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.021) 0:01:05.485 ********* ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-size.yml:164 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.022) 0:01:05.507 ********* skipping: [sut] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:5 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.019) 0:01:05.526 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:13 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.019) 0:01:05.545 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:18 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.018) 0:01:05.564 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:27 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.020) 0:01:05.585 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:35 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.019) 0:01:05.604 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:41 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.020) 0:01:05.625 ********* 
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-cache.yml:47 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.020) 0:01:05.645 ********* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume.yml:27 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.020) 0:01:05.666 ********* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:44 Wednesday 31 May 2023 18:10:45 +0000 (0:00:00.018) 0:01:05.685 ********* TASK [Clean up variable namespace] ********************************************* task path: /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:54 Wednesday 31 May 2023 18:10:46 +0000 (0:00:00.016) 0:01:05.701 ********* ok: [sut] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* sut : ok=302 changed=7 unreachable=0 failed=0 skipped=404 rescued=0 ignored=0 Wednesday 31 May 2023 18:10:46 +0000 (0:00:00.008) 0:01:05.709 ********* =============================================================================== linux-system-roles.storage : Make sure blivet is available ------------- 10.56s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 linux-system-roles.storage : Manage the pools and volumes to 
match the specified state --- 3.61s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:73 Ensure cryptsetup is present -------------------------------------------- 3.23s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:10 linux-system-roles.storage : Make sure required packages are installed --- 2.44s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:39 Ensure cryptsetup is present -------------------------------------------- 2.36s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:10 Ensure cryptsetup is present -------------------------------------------- 2.32s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:10 Ensure cryptsetup is present -------------------------------------------- 2.32s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/test-verify-volume-encryption.yml:10 linux-system-roles.storage : Get service facts -------------------------- 2.18s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:46 linux-system-roles.storage : Manage the pools and volumes to match the specified state --- 1.67s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:73 linux-system-roles.storage : Manage the pools and volumes to match the specified state --- 1.33s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:73 linux-system-roles.storage : Manage the pools and volumes to match the specified state --- 1.22s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:73 Read the /etc/fstab file for volume existence --------------------------- 1.20s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:20 Read 
the /etc/fstab file for volume existence --------------------------- 1.20s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/verify-role-results.yml:20 Gathering Facts --------------------------------------------------------- 0.82s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/tests_create_partition_volume_then_remove_scsi_generated.yml:3 linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab --- 0.76s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:142 linux-system-roles.storage : Get required packages ---------------------- 0.62s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 linux-system-roles.storage : Update facts ------------------------------- 0.62s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:209 linux-system-roles.storage : Update facts ------------------------------- 0.61s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:209 linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab --- 0.60s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:142 linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab --- 0.60s /WORKDIR/git-main_lvm-raid-stripe-sizeq7sss_ht/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:174 ---^---^---^---^---^--- # STDERR: ---v---v---v---v---v--- /opt/ansible-2.9/lib/python3.6/site-packages/ansible/parsing/vault/__init__.py:44: CryptographyDeprecationWarning: Python 3.6 is no longer supported by the Python core team. Therefore, support for it is deprecated in cryptography. The next release of cryptography will remove support for Python 3.6. 
from cryptography.exceptions import InvalidSignature [DEPRECATION WARNING]: Distribution fedora 36 on host sut should use /usr/bin/python3, but is using /usr/bin/python for backward compatibility with prior Ansible releases. A future Ansible release will default to using the discovered platform python for this host. See https://docs.ansible.com/ansible/ 2.9/reference_appendices/interpreter_discovery.html for more information. This feature will be removed in version 2.12. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. ---^---^---^---^---^---