# STDOUT: ---v---v---v---v---v---
ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/home/jenkins/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /opt/ansible-2.9/lib/python3.9/site-packages/ansible
  executable location = /opt/ansible-2.9/bin/ansible-playbook
  python version = 3.9.18 (main, Sep 7 2023, 00:00:00) [GCC 11.4.1 20230605 (Red Hat 11.4.1-2)]
Using /etc/ansible/ansible.cfg as config file
statically imported: /WORKDIR/git-weekly-civnd1vj4n/tests/create-test-file.yml
statically imported: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-data-preservation.yml
statically imported: /WORKDIR/git-weekly-civnd1vj4n/tests/create-test-file.yml
statically imported: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-data-preservation.yml
statically imported: /WORKDIR/git-weekly-civnd1vj4n/tests/create-test-file.yml
statically imported: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-data-preservation.yml
statically imported: /WORKDIR/git-weekly-civnd1vj4n/tests/create-test-file.yml
statically imported: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-data-preservation.yml
statically imported: /WORKDIR/git-weekly-civnd1vj4n/tests/create-test-file.yml
statically imported: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-data-preservation.yml
statically imported: /WORKDIR/git-weekly-civnd1vj4n/tests/create-test-file.yml
statically imported: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-data-preservation.yml
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
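[Editor's orientation note] The run that follows exercises the linux-system-roles.storage role against a LUKS-encrypted disk volume; later in the log the role is shown being called with storage_safe_mode true and a single disk volume (name "foo", disk "sda", mount point "/opt/test1", encryption enabled), first deliberately without an encryption key so the "Verify role raises correct error" step can check the failure path. A minimal sketch of the kind of play that drives such a run is shown below; the variable values are taken from the log itself, but the play structure, host name usage, and comments are assumptions, not the actual test file.

    - hosts: sut
      become: true
      tasks:
        - name: Invoke the storage role with a new encrypted disk volume (sketch)
          include_role:
            name: linux-system-roles.storage
          vars:
            storage_safe_mode: true        # matches storage_safe_mode_global in the log
            storage_pools: []
            storage_volumes:
              - name: foo                  # values match the volume shown in the log
                type: disk
                disks: ["sda"]
                mount_point: /opt/test1
                encryption: true
                # encryption_password deliberately omitted: the first test case expects
                # the role to fail when no key is given for a new encrypted volume

The log output resumes below.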
PLAYBOOK: tests_luks.yml ******************************************************* 1 plays in /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml PLAY [Test LUKS] *************************************************************** TASK [Gathering Facts] ********************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:2 Sunday 03 December 2023 04:30:33 +0000 (0:00:00.017) 0:00:00.017 ******* ok: [sut] META: ran handlers TASK [Enable FIPS mode] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:21 Sunday 03 December 2023 04:30:35 +0000 (0:00:02.074) 0:00:02.091 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reboot] ****************************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:25 Sunday 03 December 2023 04:30:35 +0000 (0:00:00.024) 0:00:02.116 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure dracut-fips] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:35 Sunday 03 December 2023 04:30:35 +0000 (0:00:00.023) 0:00:02.139 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Configure boot for FIPS] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:42 Sunday 03 December 2023 04:30:35 +0000 (0:00:00.023) 0:00:02.163 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reboot] ****************************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:51 Sunday 03 December 2023 04:30:35 +0000 (0:00:00.022) 0:00:02.186 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role] ************************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:55 Sunday 03 December 2023 04:30:36 +0000 (0:00:00.021) 0:00:02.208 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 04:30:36 +0000 (0:00:00.024) 0:00:02.233 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:30:36 +0000 (0:00:00.028) 0:00:02.261 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:30:36 +0000 (0:00:00.024) 0:00:02.286 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ 
"dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:30:36 +0000 (0:00:00.041) 0:00:02.327 ******* ok: [sut] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:30:36 +0000 (0:00:00.266) 0:00:02.594 ******* ok: [sut] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:30:36 +0000 (0:00:00.025) 0:00:02.620 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:30:36 +0000 (0:00:00.010) 0:00:02.631 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:30:36 +0000 (0:00:00.010) 0:00:02.641 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:30:36 +0000 (0:00:00.050) 0:00:02.692 ******* changed: [sut] => { "changed": true, "rc": 0, "results": [ "Installed: iniparser-4.1-11.fc37.x86_64", "Installed: blivet-data-1:3.5.0-2.fc37.noarch", "Installed: python3-pyparted-1:3.12.0-6.fc37.x86_64", "Installed: libblockdev-btrfs-2.28-2.fc37.x86_64", "Installed: libblockdev-dm-2.28-2.fc37.x86_64", "Installed: device-mapper-multipath-0.9.0-4.fc37.x86_64", "Installed: device-mapper-multipath-libs-0.9.0-4.fc37.x86_64", "Installed: libblockdev-kbd-2.28-2.fc37.x86_64", "Installed: lzo-2.10-7.fc37.x86_64", "Installed: libblockdev-lvm-2.28-2.fc37.x86_64", "Installed: btrfs-progs-6.5.1-1.fc37.x86_64", "Installed: dmraid-1.0.0.rc16-53.fc37.x86_64", "Installed: dmraid-events-1.0.0.rc16-53.fc37.x86_64", "Installed: libblockdev-mpath-2.28-2.fc37.x86_64", "Installed: dmraid-libs-1.0.0.rc16-53.fc37.x86_64", "Installed: libblockdev-nvdimm-2.28-2.fc37.x86_64", "Installed: 
cxl-libs-78-1.fc37.x86_64", "Installed: lsof-4.94.0-4.fc37.x86_64", "Installed: python3-bytesize-2.10-1.fc37.x86_64", "Installed: daxctl-libs-78-1.fc37.x86_64", "Installed: libbytesize-2.10-1.fc37.x86_64", "Installed: ndctl-78-1.fc37.x86_64", "Installed: ndctl-libs-78-1.fc37.x86_64", "Installed: python3-blivet-1:3.5.0-2.fc37.noarch", "Installed: python3-blockdev-2.28-2.fc37.x86_64", "Installed: bcache-tools-1.1-3.fc37.x86_64", "Installed: sgpio-1.2.0.10-35.fc37.x86_64", "Removed: libbytesize-2.9-1.fc37.x86_64" ] } lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:30:43 +0000 (0:00:06.964) 0:00:09.657 ******* ok: [sut] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:30:43 +0000 (0:00:00.028) 0:00:09.686 ******* ok: [sut] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:30:43 +0000 (0:00:00.026) 0:00:09.712 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:30:44 +0000 (0:00:00.592) 0:00:10.305 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:30:44 +0000 (0:00:00.042) 0:00:10.348 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:30:44 +0000 (0:00:00.020) 0:00:10.368 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:30:44 +0000 (0:00:00.013) 0:00:10.381 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:30:44 +0000 (0:00:00.022) 0:00:10.404 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:30:46 +0000 (0:00:02.591) 0:00:12.995 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", 
"state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"polkit.service": { "name": "polkit.service", "source": "systemd", "state": "inactive", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", 
"source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": 
"user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:30:48 +0000 (0:00:02.163) 0:00:15.159 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:30:48 +0000 (0:00:00.023) 0:00:15.183 ******* TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:30:48 +0000 (0:00:00.012) 0:00:15.196 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:98 Sunday 03 December 2023 04:30:49 +0000 (0:00:00.376) 0:00:15.572 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:30:49 +0000 (0:00:00.016) 0:00:15.588 ******* TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:116 Sunday 03 December 2023 04:30:49 +0000 (0:00:00.012) 0:00:15.601 ******* ok: [sut] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:121 Sunday 03 December 2023 04:30:49 +0000 (0:00:00.016) 0:00:15.618 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:125 Sunday 03 December 2023 04:30:49 +0000 (0:00:00.016) 0:00:15.634 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:141 Sunday 03 December 2023 04:30:49 +0000 (0:00:00.015) 0:00:15.650 ******* TASK [linux-system-roles.storage : Tell systemd to refresh its 
view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:153 Sunday 03 December 2023 04:30:49 +0000 (0:00:00.014) 0:00:15.664 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:158 Sunday 03 December 2023 04:30:49 +0000 (0:00:00.014) 0:00:15.679 ******* TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:170 Sunday 03 December 2023 04:30:49 +0000 (0:00:00.014) 0:00:15.693 ******* TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:185 Sunday 03 December 2023 04:30:49 +0000 (0:00:00.014) 0:00:15.708 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:193 Sunday 03 December 2023 04:30:49 +0000 (0:00:00.016) 0:00:15.724 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701577272.732126, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1696931323.96, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131081, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1696930864.766, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "781860367", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Sunday 03 December 2023 04:30:49 +0000 (0:00:00.211) 0:00:15.936 ******* TASK [linux-system-roles.storage : Update facts] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:220 Sunday 03 December 2023 04:30:49 +0000 (0:00:00.014) 0:00:15.950 ******* ok: [sut] TASK [Get unused disks] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:59 Sunday 03 December 2023 04:30:50 +0000 (0:00:00.805) 0:00:16.755 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/get_unused_disk.yml for sut TASK [Ensure test packages] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/get_unused_disk.yml:2 Sunday 03 December 2023 04:30:50 +0000 (0:00:00.024) 0:00:16.780 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: util-linux-core TASK [Find unused disks in the system] ***************************************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/get_unused_disk.yml:16 Sunday 03 December 2023 04:30:53 +0000 (0:00:02.589) 0:00:19.369 ******* ok: [sut] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/get_unused_disk.yml:24 Sunday 03 December 2023 04:30:53 +0000 (0:00:00.298) 0:00:19.667 ******* ok: [sut] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/get_unused_disk.yml:29 Sunday 03 December 2023 04:30:53 +0000 (0:00:00.018) 0:00:19.685 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/get_unused_disk.yml:34 Sunday 03 December 2023 04:30:53 +0000 (0:00:00.015) 0:00:19.701 ******* ok: [sut] => { "unused_disks": [ "sda" ] } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:68 Sunday 03 December 2023 04:30:53 +0000 (0:00:00.015) 0:00:19.716 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml for sut TASK [Store global variable value copy] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:4 Sunday 03 December 2023 04:30:53 +0000 (0:00:00.026) 0:00:19.742 ******* ok: [sut] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:10 Sunday 03 December 2023 04:30:53 +0000 (0:00:00.017) 0:00:19.760 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 04:30:53 +0000 (0:00:00.022) 0:00:19.783 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:30:53 +0000 (0:00:00.040) 0:00:19.823 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:30:53 +0000 (0:00:00.017) 0:00:19.841 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ 
"/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:30:53 +0000 (0:00:00.044) 0:00:19.886 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:30:53 +0000 (0:00:00.015) 0:00:19.901 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:30:53 +0000 (0:00:00.013) 0:00:19.915 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:30:53 +0000 (0:00:00.013) 0:00:19.929 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:30:53 +0000 (0:00:00.012) 0:00:19.941 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:30:53 +0000 (0:00:00.030) 0:00:19.972 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:30:56 +0000 (0:00:02.617) 0:00:22.590 ******* ok: [sut] => { "storage_pools": [] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:30:56 +0000 (0:00:00.019) 0:00:22.609 ******* ok: [sut] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : Get required packages] ********************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:30:56 +0000 (0:00:00.019) 0:00:22.629 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:30:59 +0000 (0:00:02.690) 0:00:25.319 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:30:59 +0000 (0:00:00.032) 0:00:25.351 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:30:59 +0000 (0:00:00.025) 0:00:25.377 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:30:59 +0000 (0:00:00.015) 0:00:25.393 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:30:59 +0000 (0:00:00.025) 0:00:25.418 ******* changed: [sut] => { "changed": true, "rc": 0, "results": [ "Installed: cryptsetup-2.6.1-1.fc37.x86_64" ] } lsrpackages: cryptsetup kpartx TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:31:03 +0000 (0:00:04.174) 0:00:29.592 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { 
"name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", 
"status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:31:05 +0000 (0:00:02.036) 0:00:31.629 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:31:05 +0000 (0:00:00.024) 0:00:31.654 ******* TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:31:05 +0000 (0:00:00.013) 0:00:31.667 ******* fatal: [sut]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [linux-system-roles.storage : Failed message] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:106 Sunday 03 December 2023 04:31:07 +0000 (0:00:01.550) 0:00:33.217 ******* fatal: [sut]: FAILED! => { "changed": false } MSG: {'msg': "encrypted volume 'foo' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:31:07 +0000 (0:00:00.019) 0:00:33.237 ******* TASK [Check that we failed in the role] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:29 Sunday 03 December 2023 04:31:07 +0000 (0:00:00.012) 0:00:33.249 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and 
error message are correct] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:34 Sunday 03 December 2023 04:31:07 +0000 (0:00:00.016) 0:00:33.266 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:45 Sunday 03 December 2023 04:31:07 +0000 (0:00:00.022) 0:00:33.289 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:83 Sunday 03 December 2023 04:31:07 +0000 (0:00:00.014) 0:00:33.303 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 04:31:07 +0000 (0:00:00.035) 0:00:33.338 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:31:07 +0000 (0:00:00.024) 0:00:33.363 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:31:07 +0000 (0:00:00.019) 0:00:33.382 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:31:07 +0000 (0:00:00.041) 0:00:33.424 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:31:07 +0000 (0:00:00.015) 0:00:33.439 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in 
testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:31:07 +0000 (0:00:00.019) 0:00:33.459 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:31:07 +0000 (0:00:00.016) 0:00:33.475 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:31:07 +0000 (0:00:00.015) 0:00:33.491 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:31:07 +0000 (0:00:00.035) 0:00:33.526 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:31:09 +0000 (0:00:02.600) 0:00:36.127 ******* ok: [sut] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:31:09 +0000 (0:00:00.018) 0:00:36.145 ******* ok: [sut] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:31:09 +0000 (0:00:00.020) 0:00:36.166 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:31:11 +0000 (0:00:01.510) 0:00:37.676 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:31:11 +0000 (0:00:00.027) 0:00:37.704 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:31:11 +0000 (0:00:00.067) 0:00:37.772 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:31:11 +0000 (0:00:00.016) 0:00:37.788 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:31:11 +0000 (0:00:00.024) 0:00:37.812 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:31:14 +0000 (0:00:02.609) 0:00:40.421 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": 
"disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": 
"mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": 
"plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": 
"running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { 
"name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": 
"systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:31:16 +0000 (0:00:02.051) 0:00:42.473 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:31:16 +0000 (0:00:00.023) 0:00:42.497 ******* TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:31:16 +0000 (0:00:00.013) 0:00:42.510 ******* changed: [sut] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-528ac4f2-6d85-441c-9883-964514188387", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": 
false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:98 Sunday 03 December 2023 04:31:27 +0000 (0:00:11.425) 0:00:53.935 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:31:27 +0000 (0:00:00.014) 0:00:53.950 ******* TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:116 Sunday 03 December 2023 04:31:27 +0000 (0:00:00.012) 0:00:53.962 ******* ok: [sut] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-528ac4f2-6d85-441c-9883-964514188387", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": 
"/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:121 Sunday 03 December 2023 04:31:27 +0000 (0:00:00.018) 0:00:53.981 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:125 Sunday 03 December 2023 04:31:27 +0000 (0:00:00.016) 0:00:53.998 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:141 Sunday 03 December 2023 04:31:27 +0000 (0:00:00.016) 0:00:54.014 ******* TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:153 Sunday 03 December 2023 04:31:27 +0000 (0:00:00.013) 0:00:54.028 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:158 Sunday 03 December 2023 04:31:28 +0000 (0:00:00.936) 0:00:54.964 ******* changed: [sut] => (item={'src': '/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": 
"/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387" } TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:170 Sunday 03 December 2023 04:31:29 +0000 (0:00:00.902) 0:00:55.867 ******* skipping: [sut] => (item={'src': '/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:185 Sunday 03 December 2023 04:31:29 +0000 (0:00:00.020) 0:00:55.887 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:193 Sunday 03 December 2023 04:31:30 +0000 (0:00:00.815) 0:00:56.703 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701577272.732126, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1696931323.96, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131081, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1696930864.766, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "781860367", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Sunday 03 December 2023 04:31:30 +0000 (0:00:00.213) 0:00:56.916 ******* changed: [sut] => (item={'backing_device': '/dev/sda', 'name': 'luks-528ac4f2-6d85-441c-9883-964514188387', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-528ac4f2-6d85-441c-9883-964514188387", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:220 Sunday 03 December 2023 04:31:31 +0000 (0:00:00.323) 0:00:57.240 ******* ok: [sut] TASK [Verify role results] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:95 Sunday 03 
December 2023 04:31:31 +0000 (0:00:00.812) 0:00:58.052 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml for sut TASK [Print out pool information] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:2 Sunday 03 December 2023 04:31:31 +0000 (0:00:00.029) 0:00:58.082 ******* skipping: [sut] => {} TASK [Print out volume information] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:7 Sunday 03 December 2023 04:31:31 +0000 (0:00:00.016) 0:00:58.098 ******* ok: [sut] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:15 Sunday 03 December 2023 04:31:31 +0000 (0:00:00.050) 0:00:58.149 ******* ok: [sut] => { "changed": false, "info": { "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "size": "10G", "type": "crypt", "uuid": "d348ab55-aa46-455f-8589-5a73695b59df" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "528ac4f2-6d85-441c-9883-964514188387" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "e7a9b339-c47b-44f9-a3db-5567a302dcca" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:20 Sunday 03 December 2023 04:31:32 +0000 (0:00:00.279) 0:00:58.428 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003324", "end": "2023-12-03 04:31:32.480550", "rc": 0, "start": "2023-12-03 04:31:32.477226" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Oct 10 09:41:04 2023 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=e7a9b339-c47b-44f9-a3db-5567a302dcca / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:25 Sunday 03 December 2023 04:31:32 +0000 (0:00:00.279) 0:00:58.708 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003327", "end": "2023-12-03 04:31:32.694724", "failed_when_result": false, "rc": 0, "start": "2023-12-03 04:31:32.691397" } STDOUT: luks-528ac4f2-6d85-441c-9883-964514188387 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:34 Sunday 03 December 2023 04:31:32 +0000 (0:00:00.211) 0:00:58.920 ******* TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:44 Sunday 03 December 2023 04:31:32 +0000 (0:00:00.013) 0:00:58.933 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml for sut TASK [Set storage volume test variables] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:2 Sunday 03 December 2023 04:31:32 +0000 (0:00:00.029) 0:00:58.962 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:21 Sunday 03 December 2023 04:31:32 +0000 (0:00:00.017) 0:00:58.980 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml for sut TASK [Get expected mount device based on device type] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:7 
Sunday 03 December 2023 04:31:32 +0000 (0:00:00.073) 0:00:59.054 ******* ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:16 Sunday 03 December 2023 04:31:32 +0000 (0:00:00.021) 0:00:59.075 ******* ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2574562, "block_size": 4096, "block_total": 2600960, "block_used": 26398, "device": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "fstype": "xfs", "inode_available": 5234685, "inode_total": 5234688, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10545405952, "size_total": 10653532160, "uuid": "d348ab55-aa46-455f-8589-5a73695b59df" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2574562, "block_size": 4096, "block_total": 2600960, "block_used": 26398, "device": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "fstype": "xfs", "inode_available": 5234685, "inode_total": 5234688, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10545405952, "size_total": 10653532160, "uuid": "d348ab55-aa46-455f-8589-5a73695b59df" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:38 Sunday 03 December 2023 04:31:32 +0000 (0:00:00.029) 0:00:59.105 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:51 Sunday 03 December 2023 04:31:32 +0000 (0:00:00.017) 0:00:59.123 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:63 Sunday 03 December 2023 04:31:32 +0000 (0:00:00.025) 0:00:59.149 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:71 Sunday 03 December 2023 04:31:32 +0000 (0:00:00.020) 0:00:59.169 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:83 Sunday 03 December 2023 04:31:32 +0000 (0:00:00.016) 0:00:59.186 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:95 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.016) 0:00:59.202 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the mount fs type] ************************************************ task 
path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:110 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.019) 0:00:59.222 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get path of test volume device] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:122 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.024) 0:00:59.247 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:128 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.016) 0:00:59.264 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:134 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.015) 0:00:59.280 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:146 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.015) 0:00:59.295 ******* ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:2 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.014) 0:00:59.309 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:40 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.030) 0:00:59.340 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:48 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.018) 0:00:59.358 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:58 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.016) 0:00:59.375 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:71 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.013) 
0:00:59.389 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:3 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.013) 0:00:59.403 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:12 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.018) 0:00:59.421 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:3 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.018) 0:00:59.440 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701577891.0029397, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1701577887.100963, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 560, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1701577887.100963, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:9 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.216) 0:00:59.656 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:16 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.020) 0:00:59.677 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:24 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.014) 0:00:59.691 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:30 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.018) 0:00:59.710 ******* ok: [sut] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:34 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.015) 0:00:59.726 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's 
device type] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:39 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.014) 0:00:59.741 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:3 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.018) 0:00:59.759 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701577891.0049396, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1701577887.6119602, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 744, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1701577887.6119602, "nlink": 1, "path": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:10 Sunday 03 December 2023 04:31:33 +0000 (0:00:00.219) 0:00:59.979 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:17 Sunday 03 December 2023 04:31:36 +0000 (0:00:02.584) 0:01:02.563 ******* ok: [sut] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.007840", "end": "2023-12-03 04:31:36.557443", "rc": 0, "start": "2023-12-03 04:31:36.549603" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 528ac4f2-6d85-441c-9883-964514188387 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 728183 Threads: 2 Salt: a7 0a 46 9b 25 22 f5 fa d2 98 a4 4b b6 c2 cb a5 33 e1 f9 83 d3 d4 ac f2 78 3e 44 04 b4 6f 9d 69 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 93891 Salt: fc ba 5e 5e a4 3c 61 2b 0a ea 38 b9 c8 6b 71 d8 59 35 2c 97 57 c8 db b0 b2 32 1a 52 f2 39 43 2a Digest: 22 4e a1 88 6c 3d 83 d1 2c 58 e4 18 4c 19 19 f3 46 c8 52 2c 8c 6e 71 d6 d5 5f f4 80 51 6c fd 96 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:23 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.218) 0:01:02.782 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:32 Sunday 03 December 2023 
04:31:36 +0000 (0:00:00.025) 0:01:02.808 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:45 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.024) 0:01:02.832 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:51 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.021) 0:01:02.853 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:56 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.020) 0:01:02.873 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:69 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.017) 0:01:02.890 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:81 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.016) 0:01:02.907 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:94 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.017) 0:01:02.925 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-528ac4f2-6d85-441c-9883-964514188387 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:106 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.021) 0:01:02.947 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:114 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.020) 0:01:02.967 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:122 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.022) 0:01:02.990 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:131 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.024) 0:01:03.014 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:140 Sunday 03 
December 2023 04:31:36 +0000 (0:00:00.025) 0:01:03.039 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:8 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.017) 0:01:03.057 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:14 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.016) 0:01:03.073 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:21 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.016) 0:01:03.090 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:28 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.016) 0:01:03.106 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:35 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.015) 0:01:03.121 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:45 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.017) 0:01:03.139 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:54 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.018) 0:01:03.157 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:63 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.016) 0:01:03.173 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:72 Sunday 03 December 2023 04:31:36 +0000 (0:00:00.017) 0:01:03.191 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:81 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.015) 0:01:03.207 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:3 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.016) 0:01:03.223 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:11 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.017) 0:01:03.240 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:20 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.019) 0:01:03.260 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:28 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.016) 0:01:03.277 ******* ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:32 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.017) 0:01:03.294 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:46 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.017) 0:01:03.312 ******* skipping: [sut] => {} TASK [Show test blockinfo] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:50 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.016) 0:01:03.328 ******* skipping: [sut] => {} TASK [Show test pool size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:54 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.015) 0:01:03.343 ******* skipping: [sut] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:58 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.018) 0:01:03.361 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:68 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.017) 0:01:03.378 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:72 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.015) 0:01:03.394 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:77 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.014) 0:01:03.409 ******* skipping: [sut] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:83 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.014) 0:01:03.424 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:88 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.014) 0:01:03.438 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:96 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.016) 0:01:03.454 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:104 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.015) 0:01:03.470 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:109 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.015) 0:01:03.485 ******* skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:113 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.015) 0:01:03.501 ******* skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:117 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.015) 0:01:03.517 ******* skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:121 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.016) 0:01:03.533 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:129 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.017) 0:01:03.550 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:138 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.016) 0:01:03.567 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:142 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.015) 0:01:03.582 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:150 Sunday 03 December 2023 
04:31:37 +0000 (0:00:00.015) 0:01:03.598 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:156 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.019) 0:01:03.617 ******* ok: [sut] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:160 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.017) 0:01:03.635 ******* ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:164 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.018) 0:01:03.654 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:5 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.017) 0:01:03.672 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:13 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.019) 0:01:03.691 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:18 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.019) 0:01:03.711 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:27 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.015) 0:01:03.726 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:35 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.015) 0:01:03.742 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:41 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.017) 0:01:03.760 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:47 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.015) 0:01:03.775 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: 
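The size-verification tasks above build an expected size for the volume (from the requested size, or from the pool size and a percentage plus reserved-space limits for thin-provisioned volumes) and then compare it with the size actually found; in this run every step is skipped because the conditions are false for this volume. As a rough, hypothetical sketch of the final comparison, reusing the variable names printed above but not the test's actual expressions:

- name: Assert expected size is actual size
  assert:
    that:
      - storage_test_actual_size == storage_test_expected_size
    msg: "volume size does not match the expected size"
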
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:27 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.014) 0:01:03.789 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:54 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.014) 0:01:03.804 ******* ok: [sut] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/create-test-file.yml:12 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.014) 0:01:03.818 ******* changed: [sut] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:101 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.315) 0:01:04.133 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml for sut TASK [Store global variable value copy] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:4 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.030) 0:01:04.164 ******* ok: [sut] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:10 Sunday 03 December 2023 04:31:37 +0000 (0:00:00.021) 0:01:04.185 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 04:31:38 +0000 (0:00:00.021) 0:01:04.207 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:31:38 +0000 (0:00:00.050) 0:01:04.257 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:31:38 +0000 (0:00:00.021) 0:01:04.278 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], 
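The "Create a file" task above drops an empty marker file at /opt/test1/quux before the safe-mode check; its survival is verified again after the role is expected to fail. A minimal sketch of that step, assuming create-test-file.yml uses the file module with state touch (the path and the resulting empty 0644 file match the output above):

- name: Create a file
  file:
    path: /opt/test1/quux
    state: touch
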
"ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:31:38 +0000 (0:00:00.041) 0:01:04.320 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:31:38 +0000 (0:00:00.016) 0:01:04.337 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:31:38 +0000 (0:00:00.017) 0:01:04.355 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:31:38 +0000 (0:00:00.016) 0:01:04.371 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:31:38 +0000 (0:00:00.014) 0:01:04.386 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:31:38 +0000 (0:00:00.034) 0:01:04.420 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:31:40 +0000 (0:00:02.599) 0:01:07.019 ******* ok: [sut] => { "storage_pools": [] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:31:40 +0000 (0:00:00.021) 0:01:07.040 ******* ok: [sut] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:31:40 
+0000 (0:00:00.026) 0:01:07.067 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:31:42 +0000 (0:00:01.678) 0:01:08.745 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:31:42 +0000 (0:00:00.031) 0:01:08.777 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:31:42 +0000 (0:00:00.024) 0:01:08.802 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:31:42 +0000 (0:00:00.014) 0:01:08.816 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:31:42 +0000 (0:00:00.023) 0:01:08.840 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:31:45 +0000 (0:00:02.975) 0:01:11.816 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, 
"blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": 
"selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": 
"unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { 
"name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:31:47 +0000 (0:00:02.057) 0:01:13.873 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:31:47 +0000 (0:00:00.024) 0:01:13.898 ******* TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:31:47 +0000 (0:00:00.012) 0:01:13.910 ******* fatal: [sut]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-528ac4f2-6d85-441c-9883-964514188387' in safe mode due to encryption removal TASK [linux-system-roles.storage : Failed message] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:106 Sunday 03 December 2023 04:31:49 +0000 (0:00:01.731) 0:01:15.642 ******* fatal: [sut]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-528ac4f2-6d85-441c-9883-964514188387' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'yabbadabbadoo', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:31:49 +0000 (0:00:00.020) 0:01:15.662 ******* TASK [Check that we failed in the role] **************************************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:29 Sunday 03 December 2023 04:31:49 +0000 (0:00:00.013) 0:01:15.676 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:34 Sunday 03 December 2023 04:31:49 +0000 (0:00:00.016) 0:01:15.692 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:45 Sunday 03 December 2023 04:31:49 +0000 (0:00:00.022) 0:01:15.715 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-data-preservation.yml:11 Sunday 03 December 2023 04:31:49 +0000 (0:00:00.014) 0:01:15.730 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701577897.9068975, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1701577897.9068975, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1701577897.9068975, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "154520704", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-data-preservation.yml:16 Sunday 03 December 2023 04:31:49 +0000 (0:00:00.214) 0:01:15.944 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:121 Sunday 03 December 2023 04:31:49 +0000 (0:00:00.019) 0:01:15.963 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 04:31:49 +0000 (0:00:00.043) 0:01:16.007 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:31:49 +0000 (0:00:00.021) 0:01:16.029 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:31:49 +0000 (0:00:00.017) 0:01:16.047 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { 
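After the expected failure, the test verifies that the marker file created earlier survived, i.e. the failed run did not touch the data on the encrypted volume; the stat output above shows /opt/test1/quux still present and empty. A sketch of that check, with the task names and path taken from the log and the register name assumed:

- name: Stat the file
  stat:
    path: /opt/test1/quux
  register: stat_r

- name: Assert file presence
  assert:
    that:
      - stat_r.stat.exists
    msg: "data preservation failed, the test file is gone"

The log then continues with the "Remove the encryption layer" play (tests_luks.yml:121), which re-runs the role so the LUKS layer can actually be removed; the safe-mode setting used for that run is not visible in this part of the output.
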
"ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:31:49 +0000 (0:00:00.038) 0:01:16.085 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:31:49 +0000 (0:00:00.014) 0:01:16.099 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:31:49 +0000 (0:00:00.014) 0:01:16.113 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:31:49 +0000 (0:00:00.014) 0:01:16.128 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:31:49 +0000 (0:00:00.014) 0:01:16.142 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:31:49 +0000 (0:00:00.029) 0:01:16.172 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:31:52 +0000 (0:00:02.601) 0:01:18.773 ******* ok: [sut] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:31:52 +0000 (0:00:00.016) 0:01:18.790 ******* ok: [sut] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:31:52 +0000 (0:00:00.018) 0:01:18.808 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:31:54 +0000 (0:00:01.674) 0:01:20.483 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:31:54 +0000 (0:00:00.037) 0:01:20.520 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:31:54 +0000 (0:00:00.023) 0:01:20.544 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:31:54 +0000 (0:00:00.014) 0:01:20.558 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:31:54 +0000 (0:00:00.022) 0:01:20.581 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:31:56 +0000 (0:00:02.583) 0:01:23.165 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": 
"NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": 
"dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { 
"name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:31:59 +0000 (0:00:02.065) 0:01:25.231 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 
Sunday 03 December 2023 04:31:59 +0000 (0:00:00.024) 0:01:25.256 ******* TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:31:59 +0000 (0:00:00.013) 0:01:25.269 ******* changed: [sut] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-528ac4f2-6d85-441c-9883-964514188387", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=721e8cad-cf9e-4876-9972-000beb8a1c3f", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=721e8cad-cf9e-4876-9972-000beb8a1c3f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:98 Sunday 03 December 2023 04:32:01 +0000 (0:00:02.190) 0:01:27.459 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:32:01 +0000 (0:00:00.014) 0:01:27.474 ******* TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:116 Sunday 03 December 2023 04:32:01 +0000 (0:00:00.013) 0:01:27.487 ******* ok: [sut] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": 
"/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-528ac4f2-6d85-441c-9883-964514188387", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=721e8cad-cf9e-4876-9972-000beb8a1c3f", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=721e8cad-cf9e-4876-9972-000beb8a1c3f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:121 Sunday 03 December 2023 04:32:01 +0000 (0:00:00.018) 0:01:27.506 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:125 Sunday 03 December 2023 04:32:01 +0000 (0:00:00.017) 0:01:27.524 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=721e8cad-cf9e-4876-9972-000beb8a1c3f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", 
"mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:141 Sunday 03 December 2023 04:32:01 +0000 (0:00:00.021) 0:01:27.545 ******* changed: [sut] => (item={'src': '/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-528ac4f2-6d85-441c-9883-964514188387" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:153 Sunday 03 December 2023 04:32:01 +0000 (0:00:00.225) 0:01:27.771 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:158 Sunday 03 December 2023 04:32:02 +0000 (0:00:00.797) 0:01:28.568 ******* changed: [sut] => (item={'src': 'UUID=721e8cad-cf9e-4876-9972-000beb8a1c3f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=721e8cad-cf9e-4876-9972-000beb8a1c3f", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=721e8cad-cf9e-4876-9972-000beb8a1c3f" } TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:170 Sunday 03 December 2023 04:32:02 +0000 (0:00:00.246) 0:01:28.815 ******* skipping: [sut] => (item={'src': 'UUID=721e8cad-cf9e-4876-9972-000beb8a1c3f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=721e8cad-cf9e-4876-9972-000beb8a1c3f", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:185 Sunday 03 December 2023 04:32:02 +0000 (0:00:00.021) 0:01:28.836 ******* ok: [sut] => { "changed": false, "name": 
null, "status": {} } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:193 Sunday 03 December 2023 04:32:03 +0000 (0:00:00.787) 0:01:29.623 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701577891.0019395, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "bdd68db6cbab362fbb95927b47cb1607ade8f657", "ctime": 1701577890.9989395, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263373, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1701577890.9969397, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "546571591", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Sunday 03 December 2023 04:32:03 +0000 (0:00:00.217) 0:01:29.841 ******* changed: [sut] => (item={'backing_device': '/dev/sda', 'name': 'luks-528ac4f2-6d85-441c-9883-964514188387', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-528ac4f2-6d85-441c-9883-964514188387", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [linux-system-roles.storage : Update facts] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:220 Sunday 03 December 2023 04:32:03 +0000 (0:00:00.231) 0:01:30.073 ******* ok: [sut] TASK [Verify role results] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:134 Sunday 03 December 2023 04:32:04 +0000 (0:00:00.800) 0:01:30.874 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml for sut TASK [Print out pool information] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:2 Sunday 03 December 2023 04:32:04 +0000 (0:00:00.032) 0:01:30.906 ******* skipping: [sut] => {} TASK [Print out volume information] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:7 Sunday 03 December 2023 04:32:04 +0000 (0:00:00.016) 0:01:30.923 ******* ok: [sut] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=721e8cad-cf9e-4876-9972-000beb8a1c3f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, 
"mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:15 Sunday 03 December 2023 04:32:04 +0000 (0:00:00.020) 0:01:30.943 ******* ok: [sut] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "721e8cad-cf9e-4876-9972-000beb8a1c3f" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "e7a9b339-c47b-44f9-a3db-5567a302dcca" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:20 Sunday 03 December 2023 04:32:04 +0000 (0:00:00.212) 0:01:31.155 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003310", "end": "2023-12-03 04:32:05.138828", "rc": 0, "start": "2023-12-03 04:32:05.135518" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Oct 10 09:41:04 2023 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=e7a9b339-c47b-44f9-a3db-5567a302dcca / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=721e8cad-cf9e-4876-9972-000beb8a1c3f /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:25 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.212) 0:01:31.368 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003334", "end": "2023-12-03 04:32:05.354011", "failed_when_result": false, "rc": 0, "start": "2023-12-03 04:32:05.350677" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:34 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.209) 0:01:31.578 ******* TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:44 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.013) 0:01:31.591 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml for sut TASK [Set storage volume test variables] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:2 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.030) 0:01:31.622 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:21 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.018) 0:01:31.640 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml for sut TASK [Get expected mount device based on device type] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:7 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.069) 0:01:31.709 ******* 
ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:16 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.017) 0:01:31.727 ******* ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2578630, "block_size": 4096, "block_total": 2605056, "block_used": 26426, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10562068480, "size_total": 10670309376, "uuid": "721e8cad-cf9e-4876-9972-000beb8a1c3f" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2578630, "block_size": 4096, "block_total": 2605056, "block_used": 26426, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10562068480, "size_total": 10670309376, "uuid": "721e8cad-cf9e-4876-9972-000beb8a1c3f" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:38 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.022) 0:01:31.749 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:51 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.013) 0:01:31.763 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:63 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.019) 0:01:31.782 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:71 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.018) 0:01:31.801 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:83 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.015) 0:01:31.816 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:95 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.016) 0:01:31.832 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the mount fs type] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:110 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.015) 0:01:31.848 ******* ok: [sut] => { "changed": false } MSG: All assertions passed 
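[Editor's note] The mount checks logged just above compare entries gathered into ansible_facts.mounts (device, mount point, fstype) against the expected values for the test volume. As a minimal, hypothetical sketch of that kind of check — not taken from the test files, and with made-up task and message text — an equivalent assertion could be written as:

    # Hypothetical sketch; assumes facts were gathered (gather_facts: true) on the host,
    # so ansible_facts.mounts holds one dict per mounted filesystem.
    - name: Verify /opt/test1 is mounted from the expected device
      ansible.builtin.assert:
        that:
          - >-
            ansible_facts.mounts
            | selectattr('mount', 'equalto', '/opt/test1')
            | selectattr('device', 'equalto', '/dev/sda')
            | list | length == 1
        fail_msg: /opt/test1 is not mounted from /dev/sda
        success_msg: All assertions passed

In the run above, the expected device and mount point come from the storage_volumes values printed earlier in the log (/dev/sda mounted at /opt/test1 as xfs), which is why both the by-device and by-mount-point assertions report "All assertions passed".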
TASK [Get path of test volume device] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:122 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.019) 0:01:31.867 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:128 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.016) 0:01:31.884 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:134 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.014) 0:01:31.899 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:146 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.015) 0:01:31.914 ******* ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:2 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.016) 0:01:31.931 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=721e8cad-cf9e-4876-9972-000beb8a1c3f " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:40 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.038) 0:01:31.969 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:48 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.020) 0:01:31.989 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:58 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.023) 0:01:32.012 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:71 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.016) 0:01:32.029 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, 
"storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:3 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.016) 0:01:32.045 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:12 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.021) 0:01:32.067 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:3 Sunday 03 December 2023 04:32:05 +0000 (0:00:00.019) 0:01:32.087 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701577923.8357391, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1701577921.1687555, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 560, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1701577921.1687555, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:9 Sunday 03 December 2023 04:32:06 +0000 (0:00:00.216) 0:01:32.303 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:16 Sunday 03 December 2023 04:32:06 +0000 (0:00:00.021) 0:01:32.325 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:24 Sunday 03 December 2023 04:32:06 +0000 (0:00:00.016) 0:01:32.342 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:30 Sunday 03 December 2023 04:32:06 +0000 (0:00:00.019) 0:01:32.361 ******* ok: [sut] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:34 Sunday 03 December 2023 04:32:06 +0000 (0:00:00.017) 0:01:32.378 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:39 Sunday 03 December 2023 04:32:06 +0000 (0:00:00.014) 0:01:32.393 ******* ok: 
[sut] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:3 Sunday 03 December 2023 04:32:06 +0000 (0:00:00.019) 0:01:32.412 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:10 Sunday 03 December 2023 04:32:06 +0000 (0:00:00.016) 0:01:32.428 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:17 Sunday 03 December 2023 04:32:08 +0000 (0:00:02.597) 0:01:35.026 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:23 Sunday 03 December 2023 04:32:08 +0000 (0:00:00.017) 0:01:35.043 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:32 Sunday 03 December 2023 04:32:08 +0000 (0:00:00.015) 0:01:35.058 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:45 Sunday 03 December 2023 04:32:08 +0000 (0:00:00.024) 0:01:35.082 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:51 Sunday 03 December 2023 04:32:08 +0000 (0:00:00.016) 0:01:35.099 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:56 Sunday 03 December 2023 04:32:08 +0000 (0:00:00.016) 0:01:35.115 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:69 Sunday 03 December 2023 04:32:08 +0000 (0:00:00.018) 0:01:35.134 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:81 Sunday 03 December 2023 04:32:08 +0000 (0:00:00.016) 0:01:35.150 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:94 Sunday 03 December 2023 04:32:08 +0000 (0:00:00.014) 0:01:35.165 ******* ok: [sut] 
=> { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:106 Sunday 03 December 2023 04:32:08 +0000 (0:00:00.020) 0:01:35.186 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:114 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.018) 0:01:35.204 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:122 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.015) 0:01:35.220 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:131 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.016) 0:01:35.237 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:140 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.252 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:8 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.266 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:14 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.015) 0:01:35.281 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:21 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.296 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:28 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.311 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:35 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.016) 0:01:35.327 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] 
**************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:45 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.342 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:54 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.357 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:63 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.371 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:72 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.015) 0:01:35.386 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:81 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.401 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:3 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.016) 0:01:35.417 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:11 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.016) 0:01:35.433 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:20 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.016) 0:01:35.450 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:28 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.016) 0:01:35.466 ******* ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:32 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.017) 0:01:35.483 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:46 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.015) 0:01:35.498 ******* skipping: [sut] => {} TASK [Show test blockinfo] ***************************************************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:50 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.018) 0:01:35.517 ******* skipping: [sut] => {} TASK [Show test pool size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:54 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.015) 0:01:35.532 ******* skipping: [sut] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:58 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.015) 0:01:35.548 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:68 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.015) 0:01:35.563 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:72 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.577 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:77 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.591 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:83 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.015) 0:01:35.607 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:88 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.621 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:96 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.636 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:104 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.651 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:109 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.665 ******* skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:113 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.679 ******* skipping: [sut] => {} TASK [Show test volume size] 
*************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:117 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.016) 0:01:35.695 ******* skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:121 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.015) 0:01:35.711 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:129 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.017) 0:01:35.728 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:138 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.742 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:142 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.756 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:150 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.771 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:156 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.015) 0:01:35.786 ******* ok: [sut] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:160 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.016) 0:01:35.803 ******* ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:164 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.015) 0:01:35.819 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:5 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.017) 0:01:35.836 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:13 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.850 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] 
****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:18 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.865 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:27 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.015) 0:01:35.880 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:35 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.037) 0:01:35.918 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:41 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.016) 0:01:35.934 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:47 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.015) 0:01:35.950 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:27 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.015) 0:01:35.965 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:54 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.014) 0:01:35.979 ******* ok: [sut] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/create-test-file.yml:12 Sunday 03 December 2023 04:32:09 +0000 (0:00:00.013) 0:01:35.993 ******* changed: [sut] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:140 Sunday 03 December 2023 04:32:10 +0000 (0:00:00.221) 0:01:36.214 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml for sut TASK [Store global variable value copy] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:4 Sunday 03 December 2023 04:32:10 +0000 (0:00:00.035) 0:01:36.250 ******* ok: [sut] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:10 Sunday 03 
December 2023 04:32:10 +0000 (0:00:00.019) 0:01:36.269 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 04:32:10 +0000 (0:00:00.021) 0:01:36.291 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:32:10 +0000 (0:00:00.022) 0:01:36.314 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:32:10 +0000 (0:00:00.018) 0:01:36.332 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:32:10 +0000 (0:00:00.039) 0:01:36.372 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:32:10 +0000 (0:00:00.016) 0:01:36.389 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:32:10 +0000 (0:00:00.017) 0:01:36.406 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:32:10 +0000 (0:00:00.014) 0:01:36.421 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:32:10 +0000 (0:00:00.014) 0:01:36.435 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:32:10 +0000 (0:00:00.032) 0:01:36.468 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:32:12 +0000 (0:00:02.614) 0:01:39.083 ******* ok: [sut] => { "storage_pools": [] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:32:12 +0000 (0:00:00.019) 0:01:39.102 ******* ok: [sut] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:32:12 +0000 (0:00:00.019) 0:01:39.122 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:32:14 +0000 (0:00:01.558) 0:01:40.680 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:32:14 +0000 (0:00:00.028) 0:01:40.709 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:32:14 +0000 (0:00:00.024) 0:01:40.734 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:32:14 +0000 (0:00:00.017) 0:01:40.751 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { 
"packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:32:14 +0000 (0:00:00.025) 0:01:40.776 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:32:17 +0000 (0:00:02.609) 0:01:43.386 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", 
"status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": 
"sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d528ac4f2\\x2d6d85\\x2d441c\\x2d9883\\x2d964514188387.service": { "name": "systemd-cryptsetup@luks\\x2d528ac4f2\\x2d6d85\\x2d441c\\x2d9883\\x2d964514188387.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:32:19 +0000 (0:00:02.085) 0:01:45.471 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d528ac4f2\\x2d6d85\\x2d441c\\x2d9883\\x2d964514188387.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:32:19 +0000 (0:00:00.025) 0:01:45.497 ******* changed: [sut] => (item=systemd-cryptsetup@luks\x2d528ac4f2\x2d6d85\x2d441c\x2d9883\x2d964514188387.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d528ac4f2\\x2d6d85\\x2d441c\\x2d9883\\x2d964514188387.service", "name": "systemd-cryptsetup@luks\\x2d528ac4f2\\x2d6d85\\x2d441c\\x2d9883\\x2d964514188387.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-udevd-kernel.socket dev-sda.device cryptsetup-pre.target systemd-journald.socket \"system-systemd\\\\x2dcryptsetup.slice\"", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target \"blockdev@dev-mapper-luks\\\\x2d528ac4f2\\\\x2d6d85\\\\x2d441c\\\\x2d9883\\\\x2d964514188387.target\"", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", 
"CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-528ac4f2-6d85-441c-9883-964514188387", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-528ac4f2-6d85-441c-9883-964514188387 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-528ac4f2-6d85-441c-9883-964514188387 /dev/sda - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-528ac4f2-6d85-441c-9883-964514188387 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-528ac4f2-6d85-441c-9883-964514188387 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d528ac4f2\\x2d6d85\\x2d441c\\x2d9883\\x2d964514188387.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d528ac4f2\\x2d6d85\\x2d441c\\x2d9883\\x2d964514188387.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "14732", "LimitNPROCSoft": "14732", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", 
"LimitSIGPENDING": "14732", "LimitSIGPENDINGSoft": "14732", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d528ac4f2\\\\x2d6d85\\\\x2d441c\\\\x2d9883\\\\x2d964514188387.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2023-12-03 04:32:03 UTC", "StateChangeTimestampMonotonic": "839360127", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4419", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d528ac4f2\\\\x2d6d85\\\\x2d441c\\\\x2d9883\\\\x2d964514188387.target\"", "WatchdogSignal": "6", 
"WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:32:20 +0000 (0:00:00.815) 0:01:46.312 ******* fatal: [sut]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [linux-system-roles.storage : Failed message] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:106 Sunday 03 December 2023 04:32:21 +0000 (0:00:01.563) 0:01:47.875 ******* fatal: [sut]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'yabbadabbadoo', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:32:21 +0000 (0:00:00.020) 0:01:47.895 ******* changed: [sut] => (item=systemd-cryptsetup@luks\x2d528ac4f2\x2d6d85\x2d441c\x2d9883\x2d964514188387.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d528ac4f2\\x2d6d85\\x2d441c\\x2d9883\\x2d964514188387.service", "name": "systemd-cryptsetup@luks\\x2d528ac4f2\\x2d6d85\\x2d441c\\x2d9883\\x2d964514188387.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d528ac4f2\\x2d6d85\\x2d441c\\x2d9883\\x2d964514188387.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d528ac4f2\\x2d6d85\\x2d441c\\x2d9883\\x2d964514188387.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d528ac4f2\\x2d6d85\\x2d441c\\x2d9883\\x2d964514188387.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": 
"infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "486428672", "LimitMEMLOCKSoft": "486428672", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "14732", "LimitNPROCSoft": "14732", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14732", "LimitSIGPENDINGSoft": "14732", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d528ac4f2\\x2d6d85\\x2d441c\\x2d9883\\x2d964514188387.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d528ac4f2\\\\x2d6d85\\\\x2d441c\\\\x2d9883\\\\x2d964514188387.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": 
"no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4419", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:29 Sunday 03 December 2023 04:32:22 +0000 (0:00:00.820) 0:01:48.716 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:34 Sunday 03 December 2023 04:32:22 +0000 (0:00:00.018) 0:01:48.734 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:45 Sunday 03 December 2023 04:32:22 +0000 (0:00:00.022) 0:01:48.757 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-data-preservation.yml:11 Sunday 03 December 2023 04:32:22 +0000 (0:00:00.014) 0:01:48.772 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701577929.9867015, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1701577929.9867015, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1701577929.9867015, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "216609062", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-data-preservation.yml:16 Sunday 03 December 2023 04:32:22 +0000 (0:00:00.221) 0:01:48.993 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:160 Sunday 03 December 2023 04:32:22 +0000 (0:00:00.019) 0:01:49.013 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 04:32:22 +0000 (0:00:00.057) 0:01:49.070 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:32:22 +0000 (0:00:00.024) 0:01:49.094 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:32:22 +0000 (0:00:00.020) 0:01:49.115 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:32:22 +0000 (0:00:00.043) 0:01:49.158 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:32:22 +0000 (0:00:00.018) 0:01:49.177 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:32:22 +0000 (0:00:00.016) 0:01:49.193 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:32:23 +0000 (0:00:00.019) 0:01:49.213 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:32:23 +0000 (0:00:00.016) 0:01:49.230 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:32:23 +0000 (0:00:00.035) 0:01:49.265 ******* 
ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:32:26 +0000 (0:00:03.293) 0:01:52.558 ******* ok: [sut] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:32:26 +0000 (0:00:00.017) 0:01:52.576 ******* ok: [sut] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:32:26 +0000 (0:00:00.018) 0:01:52.595 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:32:27 +0000 (0:00:01.520) 0:01:54.115 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:32:27 +0000 (0:00:00.029) 0:01:54.144 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:32:27 +0000 (0:00:00.025) 0:01:54.170 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:32:27 +0000 (0:00:00.016) 0:01:54.186 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:32:28 +0000 (0:00:00.023) 0:01:54.209 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: 
cryptsetup kpartx TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:32:30 +0000 (0:00:02.590) 0:01:56.800 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", 
"status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", 
"source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { 
"name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": 
"static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": 
"systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:32:32 +0000 (0:00:02.043) 0:01:58.843 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:32:32 +0000 (0:00:00.027) 0:01:58.871 ******* TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:32:32 +0000 (0:00:00.017) 0:01:58.888 ******* changed: [sut] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=721e8cad-cf9e-4876-9972-000beb8a1c3f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:98 
Sunday 03 December 2023 04:32:43 +0000 (0:00:10.645) 0:02:09.534 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:32:43 +0000 (0:00:00.018) 0:02:09.552 ******* TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:116 Sunday 03 December 2023 04:32:43 +0000 (0:00:00.014) 0:02:09.567 ******* ok: [sut] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=721e8cad-cf9e-4876-9972-000beb8a1c3f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:121 Sunday 03 December 2023 04:32:43 +0000 (0:00:00.020) 0:02:09.587 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:125 Sunday 03 December 2023 04:32:43 +0000 (0:00:00.017) 0:02:09.605 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:141 Sunday 03 December 2023 04:32:43 +0000 (0:00:00.018) 0:02:09.624 ******* changed: [sut] => (item={'src': 'UUID=721e8cad-cf9e-4876-9972-000beb8a1c3f', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=721e8cad-cf9e-4876-9972-000beb8a1c3f", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=721e8cad-cf9e-4876-9972-000beb8a1c3f" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:153 Sunday 03 December 2023 04:32:43 +0000 (0:00:00.224) 0:02:09.848 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:158 Sunday 03 December 2023 04:32:44 +0000 (0:00:00.791) 0:02:10.639 ******* changed: [sut] => (item={'src': '/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae" } TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: 
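The two mount loops above first remove the stale UUID-based entry for /opt/test1 from /etc/fstab and then mount the new mapper device in its place; the per-item fields (fstab, opts, dump, passno) have the shape of Ansible's mount module. A stand-alone equivalent of the second item, as an illustrative sketch rather than the role's actual task, would be:

    - name: Mount the opened LUKS device at the test mount point
      mount:
        path: /opt/test1
        src: /dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae
        fstype: xfs
        opts: defaults
        state: mounted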
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:170 Sunday 03 December 2023 04:32:44 +0000 (0:00:00.247) 0:02:10.886 ******* skipping: [sut] => (item={'src': '/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:185 Sunday 03 December 2023 04:32:44 +0000 (0:00:00.021) 0:02:10.908 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:193 Sunday 03 December 2023 04:32:45 +0000 (0:00:00.786) 0:02:11.695 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701577925.3527298, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1701577923.833739, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263374, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1701577923.8317392, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "2839022323", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Sunday 03 December 2023 04:32:45 +0000 (0:00:00.217) 0:02:11.912 ******* changed: [sut] => (item={'backing_device': '/dev/sda', 'name': 'luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:220 Sunday 03 December 2023 04:32:45 +0000 (0:00:00.232) 0:02:12.144 ******* ok: [sut] TASK [Verify role results] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:173 Sunday 03 December 2023 04:32:46 +0000 (0:00:00.808) 0:02:12.953 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml for sut TASK [Print out pool information] ********************************************** task path: 
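The "line added" result above writes a standard three-field /etc/crypttab entry: mapped name, backing device, and key field, where "-" means no key file is recorded and the passphrase must be supplied by other means. Annotated, the entry created here reads:

    luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae  /dev/sda  -
    # <name of the /dev/mapper device>         <device>  <key file; "-" = none>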
/WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:2 Sunday 03 December 2023 04:32:46 +0000 (0:00:00.035) 0:02:12.988 ******* skipping: [sut] => {} TASK [Print out volume information] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:7 Sunday 03 December 2023 04:32:46 +0000 (0:00:00.015) 0:02:13.003 ******* ok: [sut] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:15 Sunday 03 December 2023 04:32:46 +0000 (0:00:00.019) 0:02:13.022 ******* ok: [sut] => { "changed": false, "info": { "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "size": "10G", "type": "crypt", "uuid": "1d72a9b1-66d5-4537-89b6-689082c6d956" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "e7a9b339-c47b-44f9-a3db-5567a302dcca" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", 
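The device inventory above shows /dev/sda now carrying a crypto_LUKS signature while the filesystem UUID lives on the opened mapper device. The test gathers this through its own helper; outside the harness a comparable view can be obtained with lsblk, for example as a task like this (illustrative only):

    - name: List block devices with filesystem signatures and UUIDs
      command: lsblk -o NAME,FSTYPE,LABEL,SIZE,TYPE,UUID
      register: blockdev_listing
      changed_when: false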
"uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:20 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.209) 0:02:13.232 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003253", "end": "2023-12-03 04:32:47.216491", "rc": 0, "start": "2023-12-03 04:32:47.213238" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Oct 10 09:41:04 2023 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=e7a9b339-c47b-44f9-a3db-5567a302dcca / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:25 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.209) 0:02:13.442 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003285", "end": "2023-12-03 04:32:47.428475", "failed_when_result": false, "rc": 0, "start": "2023-12-03 04:32:47.425190" } STDOUT: luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:34 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.212) 0:02:13.655 ******* TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:44 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.013) 0:02:13.669 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml for sut TASK [Set storage volume test variables] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:2 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.032) 0:02:13.701 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:21 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.019) 0:02:13.721 ******* included: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml for sut TASK [Get expected mount device based on device type] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:7 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.075) 0:02:13.797 ******* ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:16 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.018) 0:02:13.815 ******* ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2574562, "block_size": 4096, "block_total": 2600960, "block_used": 26398, "device": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "fstype": "xfs", "inode_available": 5234685, "inode_total": 5234688, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10545405952, "size_total": 10653532160, "uuid": "1d72a9b1-66d5-4537-89b6-689082c6d956" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2574562, "block_size": 4096, "block_total": 2600960, "block_used": 26398, "device": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "fstype": "xfs", "inode_available": 5234685, "inode_total": 5234688, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10545405952, "size_total": 10653532160, "uuid": "1d72a9b1-66d5-4537-89b6-689082c6d956" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:38 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.023) 0:02:13.838 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:51 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.014) 0:02:13.852 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:63 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.018) 0:02:13.871 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:71 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.017) 0:02:13.888 
******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:83 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.015) 0:02:13.904 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:95 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.014) 0:02:13.918 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the mount fs type] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:110 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.014) 0:02:13.932 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get path of test volume device] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:122 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.018) 0:02:13.951 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:128 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.014) 0:02:13.966 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:134 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.014) 0:02:13.980 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:146 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.016) 0:02:13.996 ******* ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:2 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.042) 0:02:14.039 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:40 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.033) 0:02:14.072 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK 
[Verify the fstab mount point] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:48 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.019) 0:02:14.091 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:58 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.019) 0:02:14.111 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:71 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.020) 0:02:14.132 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:3 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.018) 0:02:14.150 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:12 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.022) 0:02:14.173 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:3 Sunday 03 December 2023 04:32:47 +0000 (0:00:00.021) 0:02:14.195 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701577965.9084823, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1701577962.9715002, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 560, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1701577962.9715002, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:9 Sunday 03 December 2023 04:32:48 +0000 (0:00:00.214) 0:02:14.409 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:16 Sunday 03 December 2023 04:32:48 +0000 (0:00:00.020) 0:02:14.429 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:24 Sunday 03 December 2023 04:32:48 +0000 (0:00:00.015) 0:02:14.445 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:30 Sunday 03 December 2023 04:32:48 +0000 (0:00:00.018) 0:02:14.463 ******* ok: [sut] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:34 Sunday 03 December 2023 04:32:48 +0000 (0:00:00.017) 0:02:14.480 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:39 Sunday 03 December 2023 04:32:48 +0000 (0:00:00.014) 0:02:14.495 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:3 Sunday 03 December 2023 04:32:48 +0000 (0:00:00.018) 0:02:14.514 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701577965.9104824, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1701577963.209499, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 898, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1701577963.209499, "nlink": 1, "path": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:10 Sunday 03 December 2023 04:32:48 +0000 (0:00:00.216) 0:02:14.731 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:17 Sunday 03 December 2023 04:32:51 +0000 (0:00:02.615) 0:02:17.346 ******* ok: [sut] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.007976", "end": "2023-12-03 04:32:51.340175", "rc": 0, "start": "2023-12-03 04:32:51.332199" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 729322 Threads: 2 Salt: 56 04 81 55 53 f8 cf ef 1b 4a 0c 61 7b 68 02 8c 2a 59 72 47 69 8f 4e d1 2b ea 81 56 ab 4b 
3f 18 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 93891 Salt: 53 2f 5e 65 8e 04 a8 7e 51 25 27 52 b9 3b bb 7d 54 63 9b ec 9a 20 06 15 45 8a f7 fb f1 80 30 66 Digest: 62 17 d3 ee 51 68 f3 b6 8e 03 85 94 34 dc ab c9 f7 92 a3 f0 9f 56 a6 bf 89 d1 d9 c4 03 b5 22 cf TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:23 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.222) 0:02:17.569 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:32 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.025) 0:02:17.594 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:45 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.024) 0:02:17.618 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:51 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.019) 0:02:17.638 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:56 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.019) 0:02:17.658 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:69 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.015) 0:02:17.673 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:81 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.015) 0:02:17.689 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:94 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.015) 0:02:17.704 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:106 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.022) 0:02:17.727 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:114 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.018) 
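The header dump above confirms a LUKS2 container using aes-xts-plain64 with a 512-bit key and an argon2id PBKDF. The verification task obtains it with the same command an administrator would run by hand; as a stand-alone task:

    - name: Dump the LUKS header of the backing device
      command: cryptsetup luksDump /dev/sda
      register: luks_header
      changed_when: false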
0:02:17.746 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:122 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.020) 0:02:17.766 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:131 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.021) 0:02:17.788 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:140 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.022) 0:02:17.811 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:8 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.017) 0:02:17.829 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:14 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.018) 0:02:17.848 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:21 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.018) 0:02:17.867 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:28 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.020) 0:02:17.887 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:35 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.017) 0:02:17.904 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:45 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.017) 0:02:17.921 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:54 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.015) 0:02:17.937 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:63 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.021) 
0:02:17.959 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:72 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.016) 0:02:17.975 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:81 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.016) 0:02:17.992 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:3 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.016) 0:02:18.008 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:11 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.020) 0:02:18.029 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:20 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.019) 0:02:18.048 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:28 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.020) 0:02:18.069 ******* ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:32 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.017) 0:02:18.086 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:46 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.016) 0:02:18.103 ******* skipping: [sut] => {} TASK [Show test blockinfo] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:50 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.016) 0:02:18.119 ******* skipping: [sut] => {} TASK [Show test pool size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:54 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.016) 0:02:18.136 ******* skipping: [sut] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:58 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.015) 0:02:18.152 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:68 Sunday 03 December 2023 04:32:51 +0000 (0:00:00.016) 0:02:18.169 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:72 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.035) 0:02:18.204 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:77 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.016) 0:02:18.221 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:83 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.014) 0:02:18.235 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:88 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.014) 0:02:18.249 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:96 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.013) 0:02:18.263 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:104 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.014) 0:02:18.277 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:109 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.017) 0:02:18.295 ******* skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:113 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.014) 0:02:18.310 ******* skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:117 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.014) 0:02:18.325 ******* skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:121 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.014) 0:02:18.340 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:129 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.016) 0:02:18.356 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:138 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.016) 0:02:18.373 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:142 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.019) 0:02:18.392 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:150 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.016) 0:02:18.409 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:156 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.016) 0:02:18.425 ******* ok: [sut] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:160 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.018) 0:02:18.443 ******* ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:164 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.018) 0:02:18.462 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:5 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.017) 0:02:18.479 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:13 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.018) 0:02:18.498 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:18 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.016) 0:02:18.515 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:27 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.017) 0:02:18.532 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:35 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.019) 0:02:18.552 ******* skipping: [sut] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:41 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.021) 0:02:18.574 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:47 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.019) 0:02:18.593 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:27 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.025) 0:02:18.619 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:54 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.018) 0:02:18.638 ******* ok: [sut] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:180 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.016) 0:02:18.654 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml for sut TASK [Store global variable value copy] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:4 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.052) 0:02:18.706 ******* ok: [sut] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:10 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.031) 0:02:18.738 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.036) 0:02:18.774 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.029) 0:02:18.803 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.022) 0:02:18.825 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": 
"rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.051) 0:02:18.877 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.022) 0:02:18.900 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.019) 0:02:18.920 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.016) 0:02:18.937 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.016) 0:02:18.953 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:32:52 +0000 (0:00:00.039) 0:02:18.993 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:32:55 +0000 (0:00:02.612) 0:02:21.606 ******* ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:32:55 +0000 (0:00:00.022) 0:02:21.628 ******* ok: [sut] => { "storage_volumes": [] } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:32:55 +0000 (0:00:00.021) 0:02:21.649 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:32:57 +0000 (0:00:01.695) 0:02:23.344 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:32:57 +0000 (0:00:00.055) 0:02:23.399 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:32:57 +0000 (0:00:00.024) 0:02:23.424 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:32:57 +0000 (0:00:00.014) 0:02:23.439 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:32:57 +0000 (0:00:00.022) 0:02:23.461 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:32:59 +0000 (0:00:02.599) 0:02:26.061 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": 
"arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": 
"debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:33:01 +0000 (0:00:02.076) 0:02:28.137 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:33:01 +0000 (0:00:00.027) 0:02:28.165 ******* TASK [linux-system-roles.storage 
: Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:33:01 +0000 (0:00:00.017) 0:02:28.182 ******* fatal: [sut]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [linux-system-roles.storage : Failed message] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:106 Sunday 03 December 2023 04:33:03 +0000 (0:00:01.908) 0:02:30.091 ******* fatal: [sut]: FAILED! => { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 
'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:33:03 +0000 (0:00:00.020) 0:02:30.111 ******* TASK [Check that we failed in the role] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:29 Sunday 03 December 2023 04:33:03 +0000 (0:00:00.013) 0:02:30.124 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:34 Sunday 03 December 2023 04:33:03 +0000 (0:00:00.018) 0:02:30.143 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:45 Sunday 03 December 2023 04:33:03 +0000 (0:00:00.028) 0:02:30.172 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:199 Sunday 03 December 2023 04:33:03 +0000 (0:00:00.016) 0:02:30.188 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 04:33:04 +0000 (0:00:00.061) 0:02:30.250 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:33:04 +0000 (0:00:00.025) 0:02:30.275 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:33:04 +0000 (0:00:00.019) 0:02:30.295 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:33:04 +0000 (0:00:00.040) 0:02:30.335 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:33:04 +0000 (0:00:00.014) 0:02:30.350 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:33:04 +0000 (0:00:00.014) 0:02:30.364 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:33:04 +0000 (0:00:00.013) 0:02:30.378 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:33:04 +0000 (0:00:00.041) 0:02:30.420 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:33:04 +0000 (0:00:00.035) 0:02:30.455 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:33:06 +0000 (0:00:02.618) 0:02:33.073 ******* ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:33:06 +0000 (0:00:00.019) 0:02:33.092 ******* ok: [sut] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:33:06 +0000 (0:00:00.016) 0:02:33.109 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:33:08 +0000 (0:00:01.716) 0:02:34.825 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:33:08 +0000 (0:00:00.030) 0:02:34.856 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:33:08 +0000 (0:00:00.024) 0:02:34.880 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:33:08 +0000 (0:00:00.015) 0:02:34.896 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:33:08 +0000 (0:00:00.023) 0:02:34.919 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:33:11 +0000 (0:00:02.604) 0:02:37.524 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { 
"name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": 
"selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": 
"systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:33:13 +0000 (0:00:02.038) 0:02:39.562 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:33:13 +0000 (0:00:00.026) 0:02:39.588 ******* TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:33:13 +0000 (0:00:00.014) 0:02:39.603 ******* changed: [sut] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": 
null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "state": "mounted" } ], "packages": [ "e2fsprogs", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:98 Sunday 03 December 2023 04:33:25 +0000 (0:00:11.704) 0:02:51.307 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:33:25 +0000 (0:00:00.016) 0:02:51.324 ******* TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:116 Sunday 03 December 2023 04:33:25 +0000 (0:00:00.014) 0:02:51.338 ******* ok: [sut] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "state": "mounted" } ], "packages": [ "e2fsprogs", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task 
path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:121 Sunday 03 December 2023 04:33:25 +0000 (0:00:00.021) 0:02:51.360 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:125 Sunday 03 December 2023 04:33:25 +0000 (0:00:00.018) 0:02:51.378 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:141 Sunday 03 December 2023 04:33:25 +0000 (0:00:00.016) 0:02:51.395 ******* changed: [sut] => (item={'src': '/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:153 Sunday 03 December 2023 04:33:25 +0000 (0:00:00.223) 0:02:51.619 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:158 Sunday 03 December 2023 04:33:26 +0000 (0:00:00.798) 0:02:52.417 ******* changed: [sut] => (item={'src': 
'/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9" } TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:170 Sunday 03 December 2023 04:33:26 +0000 (0:00:00.244) 0:02:52.661 ******* skipping: [sut] => (item={'src': '/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:185 Sunday 03 December 2023 04:33:26 +0000 (0:00:00.021) 0:02:52.682 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:193 Sunday 03 December 2023 04:33:27 +0000 (0:00:00.788) 0:02:53.471 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701577965.9074824, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "81dfa8712ec8376168c2969bf96c8f1e127bf8d0", "ctime": 1701577965.9044824, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263373, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1701577965.9034824, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "2613474082", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Sunday 03 December 2023 04:33:27 +0000 (0:00:00.219) 0:02:53.691 ******* changed: [sut] => (item={'backing_device': '/dev/sda', 'name': 'luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "password": "-", 
"state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [sut] => (item={'backing_device': '/dev/sda1', 'name': 'luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:220 Sunday 03 December 2023 04:33:27 +0000 (0:00:00.443) 0:02:54.134 ******* ok: [sut] TASK [Verify role results] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:216 Sunday 03 December 2023 04:33:28 +0000 (0:00:00.805) 0:02:54.939 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml for sut TASK [Print out pool information] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:2 Sunday 03 December 2023 04:33:28 +0000 (0:00:00.039) 0:02:54.979 ******* ok: [sut] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:7 Sunday 03 December 2023 04:33:28 +0000 (0:00:00.030) 0:02:55.009 ******* skipping: [sut] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:15 Sunday 03 December 2023 04:33:28 +0000 (0:00:00.017) 0:02:55.027 ******* ok: [sut] => { "changed": false, "info": { "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "size": "10G", "type": "crypt", "uuid": "d7a7e4bc-3176-4e50-89cc-614679b6c456" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "daa28e2b-e938-4a40-b4b9-df3ad165c5d9" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "e7a9b339-c47b-44f9-a3db-5567a302dcca" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:20 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.217) 0:02:55.244 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003366", "end": "2023-12-03 04:33:29.229337", "rc": 0, "start": "2023-12-03 04:33:29.225971" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Oct 10 09:41:04 2023 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=e7a9b339-c47b-44f9-a3db-5567a302dcca / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:25 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.211) 0:02:55.455 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003314", "end": "2023-12-03 04:33:29.438775", "failed_when_result": false, "rc": 0, "start": "2023-12-03 04:33:29.435461" } STDOUT: luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9 /dev/sda1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:34 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.209) 0:02:55.665 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml for sut TASK [Set _storage_pool_tests] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml:5 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.032) 0:02:55.698 ******* ok: [sut] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Verify pool subset] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml:18 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.016) 0:02:55.714 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-volumes.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:2 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.035) 0:02:55.749 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:13 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.015) 0:02:55.765 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:22 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.017) 0:02:55.782 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] 
************************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:27 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.040) 0:02:55.822 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:33 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.016) 0:02:55.839 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:42 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.016) 0:02:55.855 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:48 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.015) 0:02:55.871 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:54 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.015) 0:02:55.886 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:59 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.016) 0:02:55.902 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check MD RAID] *********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:73 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.021) 0:02:55.924 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml for sut TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:8 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.034) 0:02:55.959 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:14 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.016) 0:02:55.975 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:21 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.015) 0:02:55.991 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:28 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.014) 0:02:56.005 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:35 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.014) 0:02:56.020 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:45 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.014) 0:02:56.034 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:54 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.014) 0:02:56.049 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:64 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.014) 0:02:56.063 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:74 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.016) 0:02:56.079 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:85 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.014) 0:02:56.094 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:95 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.016) 0:02:56.110 ******* ok: [sut] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:76 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.014) 0:02:56.125 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-lvmraid.yml for sut TASK [Validate pool member LVM RAID settings] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-lvmraid.yml:2 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.031) 0:02:56.157 ******* skipping: [sut] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'yabbadabbadoo', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 
'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:79 Sunday 03 December 2023 04:33:29 +0000 (0:00:00.026) 0:02:56.183 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-thin.yml for sut TASK [Validate pool member thinpool settings] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-thin.yml:2 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.037) 0:02:56.221 ******* skipping: [sut] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'yabbadabbadoo', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result 
was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:82 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.026) 0:02:56.248 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:5 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.034) 0:02:56.282 ******* ok: [sut] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:13 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.019) 0:02:56.302 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:20 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.016) 0:02:56.319 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:27 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.016) 0:02:56.335 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:85 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.016) 0:02:56.352 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-vdo.yml for sut TASK [Validate pool member VDO settings] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-vdo.yml:2 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.035) 0:02:56.387 ******* skipping: [sut] => (item={'encryption': True, 'encryption_cipher': 
None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'yabbadabbadoo', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Clean up test variables] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:88 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.022) 0:02:56.410 ******* ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-volumes.yml:3 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.014) 0:02:56.424 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml for sut TASK [Set storage volume test variables] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:2 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.032) 0:02:56.457 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", 
"size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:21 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.019) 0:02:56.476 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml for sut TASK [Get expected mount device based on device type] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:7 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.076) 0:02:56.553 ******* ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:16 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.018) 0:02:56.572 ******* ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2572529, "block_size": 4096, "block_total": 2598912, "block_used": 26383, "device": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "fstype": "xfs", "inode_available": 5230589, "inode_total": 5230592, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10537078784, "size_total": 10645143552, "uuid": "d7a7e4bc-3176-4e50-89cc-614679b6c456" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2572529, "block_size": 4096, "block_total": 2598912, "block_used": 26383, "device": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "fstype": "xfs", "inode_available": 5230589, "inode_total": 5230592, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10537078784, "size_total": 10645143552, "uuid": "d7a7e4bc-3176-4e50-89cc-614679b6c456" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:38 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.023) 0:02:56.595 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:51 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.015) 0:02:56.611 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:63 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.021) 0:02:56.632 ******* ok: [sut] => 
{ "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:71 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.019) 0:02:56.651 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:83 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.015) 0:02:56.667 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:95 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.015) 0:02:56.683 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the mount fs type] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:110 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.014) 0:02:56.697 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get path of test volume device] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:122 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.019) 0:02:56.716 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:128 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.014) 0:02:56.731 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:134 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.015) 0:02:56.746 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:146 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.042) 0:02:56.789 ******* ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:2 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.016) 0:02:56.805 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device 
identifier appears in /etc/fstab] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:40 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.034) 0:02:56.840 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:48 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.021) 0:02:56.861 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:58 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.020) 0:02:56.882 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:71 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.018) 0:02:56.901 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:3 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.015) 0:02:56.916 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:12 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.021) 0:02:56.937 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:3 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.021) 0:02:56.959 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578007.9002264, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1701578004.7412455, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 996, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1701578004.7412455, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:9 Sunday 03 December 2023 04:33:30 +0000 (0:00:00.217) 0:02:57.176 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:16 Sunday 03 December 
2023 04:33:30 +0000 (0:00:00.020) 0:02:57.197 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:24 Sunday 03 December 2023 04:33:31 +0000 (0:00:00.016) 0:02:57.214 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:30 Sunday 03 December 2023 04:33:31 +0000 (0:00:00.021) 0:02:57.235 ******* ok: [sut] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:34 Sunday 03 December 2023 04:33:31 +0000 (0:00:00.018) 0:02:57.254 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:39 Sunday 03 December 2023 04:33:31 +0000 (0:00:00.019) 0:02:57.273 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:3 Sunday 03 December 2023 04:33:31 +0000 (0:00:00.022) 0:02:57.296 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578007.9032264, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1701578004.975244, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1043, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1701578004.975244, "nlink": 1, "path": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:10 Sunday 03 December 2023 04:33:31 +0000 (0:00:00.220) 0:02:57.516 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:17 Sunday 03 December 2023 04:33:33 +0000 (0:00:02.634) 0:03:00.151 ******* ok: [sut] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.008397", "end": "2023-12-03 04:33:34.144890", "rc": 0, "start": "2023-12-03 04:33:34.136493" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: daa28e2b-e938-4a40-b4b9-df3ad165c5d9 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] 
Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 729322 Threads: 2 Salt: 0c 2f 16 b6 38 c2 54 ec 43 81 ff 8b 34 f0 ca 4f a3 86 bb 36 5c c6 92 66 84 b1 84 97 78 f1 c0 54 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 94296 Salt: 87 68 2f 02 24 91 91 67 3a a2 58 0f 49 fc 5c 62 fa 7c 66 d7 71 5c a7 40 96 94 ec 04 d1 5c ab 18 Digest: 8f c1 eb 52 e1 8f be bc cc 50 6e 13 9c dd 7f 94 56 94 a4 22 6e af 49 98 19 93 5b 65 cc 9c a8 5c TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:23 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.219) 0:03:00.370 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:32 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.022) 0:03:00.393 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:45 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.021) 0:03:00.415 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:51 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.020) 0:03:00.436 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:56 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.019) 0:03:00.455 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:69 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.015) 0:03:00.471 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:81 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.015) 0:03:00.486 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:94 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.017) 0:03:00.503 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9 /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:106 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.019) 0:03:00.523 ******* ok: [sut] => { "changed": false } 
MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:114 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.018) 0:03:00.541 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:122 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.020) 0:03:00.562 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:131 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.020) 0:03:00.583 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:140 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.020) 0:03:00.604 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:8 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.016) 0:03:00.620 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:14 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.015) 0:03:00.636 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:21 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.017) 0:03:00.653 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:28 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.016) 0:03:00.670 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:35 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.015) 0:03:00.685 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:45 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.014) 0:03:00.700 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:54 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.016) 0:03:00.716 ******* skipping: [sut] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:63 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.015) 0:03:00.731 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:72 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.015) 0:03:00.747 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:81 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.014) 0:03:00.762 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:3 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.014) 0:03:00.777 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:11 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.018) 0:03:00.795 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:20 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.017) 0:03:00.813 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:28 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.017) 0:03:00.830 ******* ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:32 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.016) 0:03:00.846 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:46 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.016) 0:03:00.862 ******* skipping: [sut] => {} TASK [Show test blockinfo] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:50 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.015) 0:03:00.878 ******* skipping: [sut] => {} TASK [Show test pool size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:54 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.015) 0:03:00.894 ******* skipping: [sut] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:58 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.017) 0:03:00.912 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:68 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.015) 0:03:00.927 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:72 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.014) 0:03:00.942 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:77 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.014) 0:03:00.957 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:83 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.014) 0:03:00.971 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:88 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.014) 0:03:00.986 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:96 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.016) 0:03:01.002 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:104 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.014) 0:03:01.017 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:109 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.014) 0:03:01.031 ******* skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:113 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.014) 0:03:01.045 ******* skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:117 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.014) 0:03:01.060 ******* skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:121 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.013) 0:03:01.074 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:129 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.015) 0:03:01.090 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:138 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.014) 0:03:01.104 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:142 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.014) 0:03:01.119 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:150 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.014) 0:03:01.134 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:156 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.014) 0:03:01.148 ******* ok: [sut] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:160 Sunday 03 December 2023 04:33:34 +0000 (0:00:00.016) 0:03:01.165 ******* ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:164 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.047) 0:03:01.212 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:5 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.018) 0:03:01.231 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:13 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.016) 0:03:01.247 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:18 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.017) 0:03:01.265 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:27 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.016) 0:03:01.282 ******* skipping: [sut] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:35 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.016) 0:03:01.298 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:41 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.023) 0:03:01.322 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:47 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.017) 0:03:01.340 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:27 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.017) 0:03:01.357 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:44 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.016) 0:03:01.373 ******* TASK [Clean up variable namespace] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:54 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.014) 0:03:01.388 ******* ok: [sut] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/create-test-file.yml:12 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.019) 0:03:01.407 ******* changed: [sut] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:222 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.217) 0:03:01.625 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml for sut TASK [Store global variable value copy] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:4 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.040) 0:03:01.665 ******* ok: [sut] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:10 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.019) 0:03:01.685 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.022) 0:03:01.708 ******* 
included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.024) 0:03:01.732 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.017) 0:03:01.750 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.044) 0:03:01.794 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.016) 0:03:01.810 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.020) 0:03:01.831 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.019) 0:03:01.850 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.017) 0:03:01.867 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is 
available] ************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:33:35 +0000 (0:00:00.035) 0:03:01.903 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:33:38 +0000 (0:00:02.597) 0:03:04.500 ******* ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:33:38 +0000 (0:00:00.021) 0:03:04.521 ******* ok: [sut] => { "storage_volumes": [] } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:33:38 +0000 (0:00:00.019) 0:03:04.541 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:33:40 +0000 (0:00:01.893) 0:03:06.434 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:33:40 +0000 (0:00:00.030) 0:03:06.464 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:33:40 +0000 (0:00:00.025) 0:03:06.490 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:33:40 +0000 (0:00:00.015) 0:03:06.505 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:33:40 +0000 (0:00:00.023) 0:03:06.529 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:33:42 +0000 (0:00:02.619) 0:03:09.149 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": 
"dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": 
"fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": 
"modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": 
"static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": 
"sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d4e2e6a0c\\x2dfae7\\x2d4be2\\x2da39c\\x2d6b04d956a7ae.service": { "name": "systemd-cryptsetup@luks\\x2d4e2e6a0c\\x2dfae7\\x2d4be2\\x2da39c\\x2d6b04d956a7ae.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": 
"systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:33:45 +0000 (0:00:02.096) 0:03:11.245 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d4e2e6a0c\\x2dfae7\\x2d4be2\\x2da39c\\x2d6b04d956a7ae.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:33:45 +0000 (0:00:00.027) 0:03:11.272 ******* changed: [sut] => (item=systemd-cryptsetup@luks\x2d4e2e6a0c\x2dfae7\x2d4be2\x2da39c\x2d6b04d956a7ae.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4e2e6a0c\\x2dfae7\\x2d4be2\\x2da39c\\x2d6b04d956a7ae.service", "name": "systemd-cryptsetup@luks\\x2d4e2e6a0c\\x2dfae7\\x2d4be2\\x2da39c\\x2d6b04d956a7ae.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target systemd-journald.socket \"system-systemd\\\\x2dcryptsetup.slice\" systemd-udevd-kernel.socket dev-sda.device", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2d4e2e6a0c\\\\x2dfae7\\\\x2d4be2\\\\x2da39c\\\\x2d6b04d956a7ae.target\" umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", 
"DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae /dev/sda - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-4e2e6a0c-fae7-4be2-a39c-6b04d956a7ae ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d4e2e6a0c\\x2dfae7\\x2d4be2\\x2da39c\\x2d6b04d956a7ae.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d4e2e6a0c\\x2dfae7\\x2d4be2\\x2da39c\\x2d6b04d956a7ae.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "14732", "LimitNPROCSoft": "14732", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14732", "LimitSIGPENDINGSoft": "14732", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", 
"LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d4e2e6a0c\\\\x2dfae7\\\\x2d4be2\\\\x2da39c\\\\x2d6b04d956a7ae.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2023-12-03 04:33:27 UTC", "StateChangeTimestampMonotonic": "923208998", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4419", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d4e2e6a0c\\\\x2dfae7\\\\x2d4be2\\\\x2da39c\\\\x2d6b04d956a7ae.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:33:45 +0000 (0:00:00.825) 0:03:12.097 ******* fatal: [sut]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9' in safe mode due to encryption removal TASK [linux-system-roles.storage : Failed message] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:106 Sunday 03 December 2023 04:33:47 +0000 (0:00:01.919) 0:03:14.017 ******* fatal: [sut]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'yabbadabbadoo', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 
'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:33:47 +0000 (0:00:00.026) 0:03:14.043 ******* changed: [sut] => (item=systemd-cryptsetup@luks\x2d4e2e6a0c\x2dfae7\x2d4be2\x2da39c\x2d6b04d956a7ae.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4e2e6a0c\\x2dfae7\\x2d4be2\\x2da39c\\x2d6b04d956a7ae.service", "name": "systemd-cryptsetup@luks\\x2d4e2e6a0c\\x2dfae7\\x2d4be2\\x2da39c\\x2d6b04d956a7ae.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d4e2e6a0c\\x2dfae7\\x2d4be2\\x2da39c\\x2d6b04d956a7ae.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d4e2e6a0c\\x2dfae7\\x2d4be2\\x2da39c\\x2d6b04d956a7ae.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d4e2e6a0c\\x2dfae7\\x2d4be2\\x2da39c\\x2d6b04d956a7ae.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": 
"yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "486428672", "LimitMEMLOCKSoft": "486428672", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "14732", "LimitNPROCSoft": "14732", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14732", "LimitSIGPENDINGSoft": "14732", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d4e2e6a0c\\x2dfae7\\x2d4be2\\x2da39c\\x2d6b04d956a7ae.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d4e2e6a0c\\\\x2dfae7\\\\x2d4be2\\\\x2da39c\\\\x2d6b04d956a7ae.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", 
"StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4419", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:29 Sunday 03 December 2023 04:33:48 +0000 (0:00:00.824) 0:03:14.867 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:34 Sunday 03 December 2023 04:33:48 +0000 (0:00:00.017) 0:03:14.885 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:45 Sunday 03 December 2023 04:33:48 +0000 (0:00:00.022) 0:03:14.908 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-data-preservation.yml:11 Sunday 03 December 2023 04:33:48 +0000 (0:00:00.014) 0:03:14.922 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578015.3991807, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1701578015.3991807, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1701578015.3991807, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2476876624", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-data-preservation.yml:16 Sunday 03 December 2023 04:33:48 +0000 (0:00:00.212) 0:03:15.135 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:246 Sunday 03 December 2023 04:33:48 +0000 (0:00:00.018) 0:03:15.154 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 
04:33:49 +0000 (0:00:00.060) 0:03:15.214 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:33:49 +0000 (0:00:00.023) 0:03:15.238 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:33:49 +0000 (0:00:00.018) 0:03:15.256 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:33:49 +0000 (0:00:00.039) 0:03:15.295 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:33:49 +0000 (0:00:00.014) 0:03:15.310 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:33:49 +0000 (0:00:00.015) 0:03:15.325 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:33:49 +0000 (0:00:00.014) 0:03:15.339 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:33:49 +0000 (0:00:00.013) 0:03:15.353 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK 
[linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:33:49 +0000 (0:00:00.029) 0:03:15.383 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:33:51 +0000 (0:00:02.637) 0:03:18.021 ******* ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:33:51 +0000 (0:00:00.022) 0:03:18.043 ******* ok: [sut] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:33:51 +0000 (0:00:00.019) 0:03:18.063 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:33:53 +0000 (0:00:01.872) 0:03:19.935 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:33:53 +0000 (0:00:00.028) 0:03:19.963 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:33:53 +0000 (0:00:00.026) 0:03:19.990 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:33:53 +0000 (0:00:00.017) 0:03:20.007 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make 
sure required packages are installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:33:53 +0000 (0:00:00.025) 0:03:20.032 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:33:56 +0000 (0:00:02.618) 0:03:22.651 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" 
}, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": 
"systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service": { "name": "systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { 
"name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:33:58 +0000 (0:00:02.046) 0:03:24.698 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:33:58 +0000 (0:00:00.025) 0:03:24.723 ******* changed: [sut] => (item=systemd-cryptsetup@luks\x2ddaa28e2b\x2de938\x2d4a40\x2db4b9\x2ddf3ad165c5d9.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "name": "systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-udevd-kernel.socket systemd-journald.socket dev-sda1.device \"system-systemd\\\\x2dcryptsetup.slice\" cryptsetup-pre.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target \"blockdev@dev-mapper-luks\\\\x2ddaa28e2b\\\\x2de938\\\\x2d4a40\\\\x2db4b9\\\\x2ddf3ad165c5d9.target\" cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", 
"ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9 /dev/sda1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "14732", "LimitNPROCSoft": "14732", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14732", "LimitSIGPENDINGSoft": "14732", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", 
"LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2ddaa28e2b\\\\x2de938\\\\x2d4a40\\\\x2db4b9\\\\x2ddf3ad165c5d9.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "\"dev-mapper-luks\\\\x2ddaa28e2b\\\\x2de938\\\\x2d4a40\\\\x2db4b9\\\\x2ddf3ad165c5d9.device\" cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2023-12-03 04:33:48 UTC", "StateChangeTimestampMonotonic": "944607153", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4419", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2ddaa28e2b\\\\x2de938\\\\x2d4a40\\\\x2db4b9\\\\x2ddf3ad165c5d9.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK 
[linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:33:59 +0000 (0:00:00.822) 0:03:25.546 ******* changed: [sut] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8", "state": "mounted" } ], "packages": [ "e2fsprogs", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:98 Sunday 03 December 2023 04:34:01 +0000 (0:00:02.610) 0:03:28.157 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:34:01 +0000 (0:00:00.015) 0:03:28.172 ******* changed: [sut] => 
(item=systemd-cryptsetup@luks\x2ddaa28e2b\x2de938\x2d4a40\x2db4b9\x2ddf3ad165c5d9.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "name": "systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": 
"infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "486428672", "LimitMEMLOCKSoft": "486428672", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "14732", "LimitNPROCSoft": "14732", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14732", "LimitSIGPENDINGSoft": "14732", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2ddaa28e2b\\\\x2de938\\\\x2d4a40\\\\x2db4b9\\\\x2ddf3ad165c5d9.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2023-12-03 04:33:48 UTC", "StateChangeTimestampMonotonic": "944607153", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", 
"TasksMax": "4419", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:116 Sunday 03 December 2023 04:34:02 +0000 (0:00:00.841) 0:03:29.013 ******* ok: [sut] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8", "state": "mounted" } ], "packages": [ "e2fsprogs", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:121 Sunday 03 December 2023 04:34:02 +0000 (0:00:00.021) 0:03:29.034 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:125 Sunday 03 December 2023 04:34:02 +0000 (0:00:00.018) 0:03:29.053 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:141 Sunday 03 December 2023 04:34:02 +0000 (0:00:00.017) 0:03:29.070 ******* changed: [sut] => (item={'src': '/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:153 Sunday 03 December 2023 04:34:03 +0000 (0:00:00.227) 0:03:29.298 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:158 Sunday 03 December 2023 04:34:03 +0000 (0:00:00.794) 0:03:30.092 ******* changed: [sut] => (item={'src': 'UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8', 'path': 
'/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8" } TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:170 Sunday 03 December 2023 04:34:04 +0000 (0:00:00.268) 0:03:30.361 ******* skipping: [sut] => (item={'src': 'UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:185 Sunday 03 December 2023 04:34:04 +0000 (0:00:00.022) 0:03:30.383 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:193 Sunday 03 December 2023 04:34:04 +0000 (0:00:00.788) 0:03:31.171 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578007.8982263, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "089f0d7d88ce2ad870373a70e8d488a19272e984", "ctime": 1701578007.8972263, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263373, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1701578007.8962264, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "2908715340", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Sunday 03 December 2023 04:34:05 +0000 (0:00:00.225) 0:03:31.397 ******* changed: [sut] => (item={'backing_device': '/dev/sda1', 'name': 'luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [linux-system-roles.storage : Update facts] 
******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:220 Sunday 03 December 2023 04:34:05 +0000 (0:00:00.242) 0:03:31.639 ******* ok: [sut] TASK [Verify role results] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:263 Sunday 03 December 2023 04:34:06 +0000 (0:00:00.825) 0:03:32.464 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml for sut TASK [Print out pool information] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:2 Sunday 03 December 2023 04:34:06 +0000 (0:00:00.045) 0:03:32.509 ******* ok: [sut] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:7 Sunday 03 December 2023 04:34:06 +0000 (0:00:00.022) 0:03:32.532 ******* skipping: [sut] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:15 Sunday 03 December 2023 04:34:06 +0000 (0:00:00.015) 0:03:32.547 ******* ok: [sut] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "e02437b8-8864-44ba-8d2c-3c557f0b87e8" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "e7a9b339-c47b-44f9-a3db-5567a302dcca" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:20 Sunday 03 December 2023 04:34:06 +0000 (0:00:00.214) 0:03:32.762 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003373", "end": "2023-12-03 04:34:06.747433", "rc": 0, "start": "2023-12-03 04:34:06.744060" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Oct 10 09:41:04 2023 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=e7a9b339-c47b-44f9-a3db-5567a302dcca / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:25 Sunday 03 December 2023 04:34:06 +0000 (0:00:00.209) 0:03:32.972 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003383", "end": "2023-12-03 04:34:06.957970", "failed_when_result": false, "rc": 0, "start": "2023-12-03 04:34:06.954587" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:34 Sunday 03 December 2023 04:34:06 +0000 (0:00:00.211) 0:03:33.184 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml for sut TASK [Set _storage_pool_tests] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml:5 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.031) 0:03:33.216 ******* ok: [sut] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Verify pool subset] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml:18 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.014) 0:03:33.230 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-volumes.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:2 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.031) 0:03:33.262 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:13 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.015) 0:03:33.278 ******* TASK [Set pvs lvm length] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:22 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.013) 0:03:33.291 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:27 Sunday 03 December 2023 
04:34:07 +0000 (0:00:00.014) 0:03:33.305 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:33 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.016) 0:03:33.321 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:42 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.016) 0:03:33.337 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:48 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.016) 0:03:33.354 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:54 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.018) 0:03:33.373 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:59 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.016) 0:03:33.389 ******* TASK [Check MD RAID] *********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:73 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.014) 0:03:33.404 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml for sut TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:8 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.031) 0:03:33.435 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:14 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.045) 0:03:33.481 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:21 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.016) 0:03:33.498 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:28 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.015) 0:03:33.513 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:35 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.017) 0:03:33.531 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] 
**************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:45 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.015) 0:03:33.546 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:54 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.014) 0:03:33.560 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:64 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.016) 0:03:33.577 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:74 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.016) 0:03:33.593 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:85 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.017) 0:03:33.610 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:95 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.016) 0:03:33.627 ******* ok: [sut] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:76 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.015) 0:03:33.642 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-lvmraid.yml for sut TASK [Validate pool member LVM RAID settings] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-lvmraid.yml:2 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.031) 0:03:33.674 ******* skipping: [sut] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'yabbadabbadoo', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 
'UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:79 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.021) 0:03:33.695 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-thin.yml for sut TASK [Validate pool member thinpool settings] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-thin.yml:2 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.030) 0:03:33.725 ******* skipping: [sut] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'yabbadabbadoo', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": 
null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:82 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.020) 0:03:33.745 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:5 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.034) 0:03:33.780 ******* ok: [sut] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:13 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.017) 0:03:33.798 ******* TASK [Validate pool member crypttab entries] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:20 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.012) 0:03:33.810 ******* TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:27 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.012) 0:03:33.823 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:85 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.015) 0:03:33.838 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-vdo.yml for sut TASK [Validate pool member VDO settings] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-vdo.yml:2 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.032) 0:03:33.871 ******* skipping: [sut] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'yabbadabbadoo', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 
'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Clean up test variables] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:88 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.019) 0:03:33.891 ******* ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-volumes.yml:3 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.014) 0:03:33.905 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml for sut TASK [Set storage volume test variables] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:2 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.032) 0:03:33.938 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:21 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.021) 0:03:33.960 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml for sut included: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml for sut TASK [Get expected mount device based on device type] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:7 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.083) 0:03:34.043 ******* ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:16 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.019) 0:03:34.063 ******* ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2576597, "block_size": 4096, "block_total": 2603008, "block_used": 26411, "device": "/dev/sda1", "fstype": "xfs", "inode_available": 5238781, "inode_total": 5238784, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10553741312, "size_total": 10661920768, "uuid": "e02437b8-8864-44ba-8d2c-3c557f0b87e8" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2576597, "block_size": 4096, "block_total": 2603008, "block_used": 26411, "device": "/dev/sda1", "fstype": "xfs", "inode_available": 5238781, "inode_total": 5238784, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10553741312, "size_total": 10661920768, "uuid": "e02437b8-8864-44ba-8d2c-3c557f0b87e8" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:38 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.023) 0:03:34.087 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:51 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.015) 0:03:34.102 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:63 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.019) 0:03:34.121 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:71 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.019) 0:03:34.141 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:83 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.018) 0:03:34.159 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:95 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.016) 0:03:34.176 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the mount fs type] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:110 Sunday 03 December 2023 04:34:07 +0000 (0:00:00.016) 0:03:34.193 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get path of test volume device] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:122 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.019) 0:03:34.212 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:128 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.014) 0:03:34.227 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:134 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.014) 0:03:34.242 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:146 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.016) 0:03:34.258 ******* ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:2 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.014) 0:03:34.273 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:40 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.034) 0:03:34.307 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:48 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.019) 0:03:34.327 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:58 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.019) 0:03:34.346 ******* 
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:71 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.014) 0:03:34.361 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:3 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.016) 0:03:34.377 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:12 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.020) 0:03:34.398 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:3 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.020) 0:03:34.419 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578045.3959968, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1701578041.8500185, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 996, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1701578041.8500185, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:9 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.245) 0:03:34.664 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:16 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.021) 0:03:34.685 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:24 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.016) 0:03:34.702 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:30 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.018) 0:03:34.720 ******* ok: [sut] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID 
value) (2/2)] ****************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:34 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.017) 0:03:34.737 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:39 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.015) 0:03:34.753 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:3 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.020) 0:03:34.773 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:10 Sunday 03 December 2023 04:34:08 +0000 (0:00:00.014) 0:03:34.788 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:17 Sunday 03 December 2023 04:34:11 +0000 (0:00:02.600) 0:03:37.388 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:23 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.017) 0:03:37.406 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:32 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.017) 0:03:37.424 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:45 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.024) 0:03:37.448 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:51 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.017) 0:03:37.465 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:56 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.016) 0:03:37.482 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:69 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.014) 0:03:37.497 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] 
******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:81 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.015) 0:03:37.512 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:94 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.014) 0:03:37.527 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:106 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.020) 0:03:37.547 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:114 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.018) 0:03:37.566 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:122 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.016) 0:03:37.582 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:131 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.014) 0:03:37.597 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:140 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.015) 0:03:37.612 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:8 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.014) 0:03:37.626 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:14 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.014) 0:03:37.641 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:21 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.016) 0:03:37.658 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:28 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.022) 0:03:37.681 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:35 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.018) 0:03:37.699 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:45 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.016) 0:03:37.716 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:54 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.015) 0:03:37.732 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:63 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.015) 0:03:37.747 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:72 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.016) 0:03:37.763 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:81 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.018) 0:03:37.782 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:3 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.015) 0:03:37.797 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:11 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.017) 0:03:37.815 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:20 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.017) 0:03:37.833 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:28 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.018) 0:03:37.852 ******* ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:32 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.018) 0:03:37.870 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:46 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.022) 0:03:37.893 ******* skipping: [sut] => {} TASK [Show test blockinfo] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:50 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.018) 0:03:37.912 ******* skipping: [sut] => {} TASK [Show test pool size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:54 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.018) 0:03:37.930 ******* skipping: [sut] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:58 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.017) 0:03:37.947 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:68 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.017) 0:03:37.965 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:72 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.015) 0:03:37.980 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:77 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.018) 0:03:37.998 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:83 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.015) 0:03:38.014 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:88 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.015) 0:03:38.030 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:96 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.015) 0:03:38.045 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:104 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.014) 0:03:38.060 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } 
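(Editor's note on the size checks above: the percentage-based and thin-pool size calculations are all skipped in this run because the volume is a plain partition with a fixed "4g" size, so there is no pool percentage to resolve. The snippet below is an illustrative sketch only, not the literal contents of test-verify-volume-size.yml; it shows the shape of the final comparison these tasks perform when they do run, assuming storage_test_actual_size and storage_test_expected_size have been populated by the earlier parsing steps, both of which appear by name in the log.)

```yaml
# Hypothetical sketch -- not the verbatim test task. It assumes
# storage_test_actual_size.bytes and storage_test_expected_size
# were set by the preceding "Parse ..." and "Establish base value"
# tasks, which are skipped for this fixed-size partition volume.
- hosts: sut
  gather_facts: false
  tasks:
    - name: Assert expected size is actual size (sketch)
      assert:
        that:
          # tolerate a small rounding difference, since the created
          # partition is rarely byte-for-byte equal to the request
          - (storage_test_actual_size.bytes - storage_test_expected_size) | abs < 10000000
        msg: >-
          Actual size {{ storage_test_actual_size.bytes }} does not match
          expected size {{ storage_test_expected_size }}
```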
TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:109 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.014) 0:03:38.075 ******* skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:113 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.016) 0:03:38.092 ******* skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:117 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.015) 0:03:38.107 ******* skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:121 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.015) 0:03:38.122 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:129 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.015) 0:03:38.137 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:138 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.014) 0:03:38.152 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:142 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.014) 0:03:38.167 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:150 Sunday 03 December 2023 04:34:11 +0000 (0:00:00.016) 0:03:38.183 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:156 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.014) 0:03:38.198 ******* ok: [sut] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:160 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.017) 0:03:38.216 ******* ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:164 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.016) 0:03:38.232 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:5 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.015) 0:03:38.248 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:13 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.014) 0:03:38.262 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:18 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.016) 0:03:38.279 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:27 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.016) 0:03:38.295 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:35 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.015) 0:03:38.310 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:41 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.014) 0:03:38.325 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:47 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.015) 0:03:38.340 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:27 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.014) 0:03:38.355 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:44 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.016) 0:03:38.371 ******* TASK [Clean up variable namespace] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:54 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.013) 0:03:38.384 ******* ok: [sut] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/create-test-file.yml:12 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.014) 0:03:38.399 ******* changed: [sut] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of 
safe_mode] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:269 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.218) 0:03:38.618 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml for sut TASK [Store global variable value copy] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:4 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.044) 0:03:38.662 ******* ok: [sut] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:10 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.055) 0:03:38.718 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.023) 0:03:38.742 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.025) 0:03:38.767 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.018) 0:03:38.786 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.041) 0:03:38.827 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.015) 0:03:38.843 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } 
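(Editor's note: at this point the test has written a marker file at /opt/test1/quux and re-runs the role with safe mode enabled, asking it to add LUKS encryption to the already-mounted, in-use partition; with storage_safe_mode true the role is expected to refuse and raise an error rather than reformat a device that holds data. The play below is an illustrative reconstruction of that step, not the exact contents of tests_luks.yml or verify-role-failed.yml; the pool and volume values mirror the storage_pools shown a few tasks further down in this log.)

```yaml
# Illustrative sketch of the safe_mode failure test. Variable values
# (pool "foo" on sda, encrypted volume "test1" mounted at /opt/test1)
# are taken from the "Show storage_pools" output later in this log.
- hosts: sut
  tasks:
    - name: Try to enable encryption on an in-use volume with safe mode on
      block:
        - include_role:
            name: linux-system-roles.storage
          vars:
            storage_safe_mode: true        # refuse destructive changes
            storage_pools:
              - name: foo
                type: partition
                disks: [sda]
                volumes:
                  - name: test1
                    type: partition
                    size: 4g
                    mount_point: /opt/test1
                    encryption: true
                    encryption_password: yabbadabbadoo
        - fail:
            msg: Role was expected to fail in safe mode but did not
      rescue:
        - debug:
            msg: Role failed as expected; data under /opt/test1 is untouched
```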
TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.015) 0:03:38.859 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.014) 0:03:38.873 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.015) 0:03:38.889 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:34:12 +0000 (0:00:00.032) 0:03:38.922 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:34:15 +0000 (0:00:02.634) 0:03:41.556 ******* ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:34:15 +0000 (0:00:00.021) 0:03:41.577 ******* ok: [sut] => { "storage_volumes": [] } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:34:15 +0000 (0:00:00.019) 0:03:41.597 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:34:17 +0000 (0:00:01.751) 0:03:43.349 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:34:17 +0000 (0:00:00.030) 0:03:43.380 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", 
"kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:34:17 +0000 (0:00:00.024) 0:03:43.404 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:34:17 +0000 (0:00:00.016) 0:03:43.421 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:34:17 +0000 (0:00:00.023) 0:03:43.445 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:34:19 +0000 (0:00:02.639) 0:03:46.084 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": 
"dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": 
"man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": 
{ "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service": { "name": "systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" 
}, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:34:21 +0000 (0:00:02.086) 0:03:48.170 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:34:21 +0000 (0:00:00.025) 0:03:48.195 ******* changed: [sut] => (item=systemd-cryptsetup@luks\x2ddaa28e2b\x2de938\x2d4a40\x2db4b9\x2ddf3ad165c5d9.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "name": "systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "\"system-systemd\\\\x2dcryptsetup.slice\" cryptsetup-pre.target dev-sda1.device systemd-udevd-kernel.socket systemd-journald.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2ddaa28e2b\\\\x2de938\\\\x2d4a40\\\\x2db4b9\\\\x2ddf3ad165c5d9.target\"", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": 
"infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9 /dev/sda1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-daa28e2b-e938-4a40-b4b9-df3ad165c5d9 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": 
"0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "14732", "LimitNPROCSoft": "14732", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14732", "LimitSIGPENDINGSoft": "14732", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2ddaa28e2b\\\\x2de938\\\\x2d4a40\\\\x2db4b9\\\\x2ddf3ad165c5d9.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2023-12-03 04:33:48 UTC", 
"StateChangeTimestampMonotonic": "944607153", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4419", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2ddaa28e2b\\\\x2de938\\\\x2d4a40\\\\x2db4b9\\\\x2ddf3ad165c5d9.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:34:22 +0000 (0:00:00.826) 0:03:49.022 ******* fatal: [sut]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [linux-system-roles.storage : Failed message] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:106 Sunday 03 December 2023 04:34:24 +0000 (0:00:01.771) 0:03:50.794 ******* fatal: [sut]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'yabbadabbadoo', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:34:24 +0000 (0:00:00.021) 0:03:50.816 ******* changed: [sut] => (item=systemd-cryptsetup@luks\x2ddaa28e2b\x2de938\x2d4a40\x2db4b9\x2ddf3ad165c5d9.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "name": 
"systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "486428672", "LimitMEMLOCKSoft": "486428672", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": 
"0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "14732", "LimitNPROCSoft": "14732", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14732", "LimitSIGPENDINGSoft": "14732", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2ddaa28e2b\\x2de938\\x2d4a40\\x2db4b9\\x2ddf3ad165c5d9.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2ddaa28e2b\\\\x2de938\\\\x2d4a40\\\\x2db4b9\\\\x2ddf3ad165c5d9.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4419", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": 
"masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:29 Sunday 03 December 2023 04:34:25 +0000 (0:00:00.832) 0:03:51.648 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:34 Sunday 03 December 2023 04:34:25 +0000 (0:00:00.020) 0:03:51.669 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:45 Sunday 03 December 2023 04:34:25 +0000 (0:00:00.026) 0:03:51.695 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-data-preservation.yml:11 Sunday 03 December 2023 04:34:25 +0000 (0:00:00.017) 0:03:51.713 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578052.391954, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1701578052.391954, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1701578052.391954, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1036282882", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-data-preservation.yml:16 Sunday 03 December 2023 04:34:25 +0000 (0:00:00.222) 0:03:51.935 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:295 Sunday 03 December 2023 04:34:25 +0000 (0:00:00.019) 0:03:51.955 ******* ok: [sut] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testp2iy_xrelukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:302 Sunday 03 December 2023 04:34:26 +0000 (0:00:00.311) 0:03:52.266 ******* ok: [sut] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testp2iy_xrelukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1701578066.1056817-112090-153158391929788/source", "state": "file", "uid": 0 } TASK [Add encryption to the volume] 
******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:309 Sunday 03 December 2023 04:34:26 +0000 (0:00:00.722) 0:03:52.988 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 04:34:26 +0000 (0:00:00.027) 0:03:53.016 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:34:26 +0000 (0:00:00.024) 0:03:53.041 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:34:26 +0000 (0:00:00.020) 0:03:53.061 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:34:26 +0000 (0:00:00.043) 0:03:53.104 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:34:26 +0000 (0:00:00.020) 0:03:53.124 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:34:26 +0000 (0:00:00.017) 0:03:53.142 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:34:26 +0000 (0:00:00.017) 0:03:53.159 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, 
"changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:34:26 +0000 (0:00:00.017) 0:03:53.177 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:34:27 +0000 (0:00:00.036) 0:03:53.213 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:34:29 +0000 (0:00:02.607) 0:03:55.821 ******* ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testp2iy_xrelukskey", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:34:29 +0000 (0:00:00.020) 0:03:55.842 ******* ok: [sut] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:34:29 +0000 (0:00:00.016) 0:03:55.858 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:34:31 +0000 (0:00:01.755) 0:03:57.613 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:34:31 +0000 (0:00:00.033) 0:03:57.646 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:34:31 +0000 (0:00:00.028) 0:03:57.674 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:34:31 +0000 (0:00:00.016) 0:03:57.691 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:34:31 +0000 (0:00:00.026) 0:03:57.718 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:34:34 +0000 (0:00:02.617) 0:04:00.335 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", 
"state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": 
"systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:34:36 +0000 (0:00:02.065) 0:04:02.400 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:34:36 +0000 (0:00:00.025) 0:04:02.426 ******* TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:34:36 +0000 (0:00:00.015) 0:04:02.441 ******* changed: [sut] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "password": "/tmp/storage_testp2iy_xrelukskey", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testp2iy_xrelukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:98 Sunday 03 December 2023 04:34:47 +0000 (0:00:11.033) 0:04:13.475 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:34:47 +0000 (0:00:00.016) 0:04:13.492 ******* TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:116 Sunday 03 December 2023 04:34:47 +0000 (0:00:00.015) 0:04:13.507 ******* ok: [sut] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "password": "/tmp/storage_testp2iy_xrelukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": 
"present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testp2iy_xrelukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:121 Sunday 03 December 2023 04:34:47 +0000 (0:00:00.021) 0:04:13.528 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testp2iy_xrelukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:125 Sunday 03 December 2023 04:34:47 +0000 (0:00:00.021) 0:04:13.550 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Remove 
obsolete mounts] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:141 Sunday 03 December 2023 04:34:47 +0000 (0:00:00.019) 0:04:13.569 ******* changed: [sut] => (item={'src': 'UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=e02437b8-8864-44ba-8d2c-3c557f0b87e8" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:153 Sunday 03 December 2023 04:34:47 +0000 (0:00:00.228) 0:04:13.798 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:158 Sunday 03 December 2023 04:34:48 +0000 (0:00:00.796) 0:04:14.594 ******* changed: [sut] => (item={'src': '/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d" } TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:170 Sunday 03 December 2023 04:34:48 +0000 (0:00:00.254) 0:04:14.848 ******* skipping: [sut] => (item={'src': '/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:185 Sunday 03 December 2023 04:34:48 +0000 (0:00:00.025) 0:04:14.874 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:193 Sunday 03 December 2023 04:34:49 +0000 (0:00:00.799) 0:04:15.673 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578046.9569871, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, 
"blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1701578045.3939967, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263374, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1701578045.3919969, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3242111922", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Sunday 03 December 2023 04:34:49 +0000 (0:00:00.221) 0:04:15.895 ******* changed: [sut] => (item={'backing_device': '/dev/sda1', 'name': 'luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d', 'password': '/tmp/storage_testp2iy_xrelukskey', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "password": "/tmp/storage_testp2iy_xrelukskey", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:220 Sunday 03 December 2023 04:34:49 +0000 (0:00:00.241) 0:04:16.137 ******* ok: [sut] TASK [Verify role results] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:326 Sunday 03 December 2023 04:34:50 +0000 (0:00:00.809) 0:04:16.946 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml for sut TASK [Print out pool information] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:2 Sunday 03 December 2023 04:34:50 +0000 (0:00:00.034) 0:04:16.981 ******* ok: [sut] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testp2iy_xrelukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, 
"raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:7 Sunday 03 December 2023 04:34:50 +0000 (0:00:00.027) 0:04:17.008 ******* skipping: [sut] => {} TASK [Collect info about the volumes.] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:15 Sunday 03 December 2023 04:34:50 +0000 (0:00:00.016) 0:04:17.024 ******* ok: [sut] => { "changed": false, "info": { "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "size": "10G", "type": "crypt", "uuid": "31ec7184-423b-4708-8f4d-7b9b661749f2" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "e7a9b339-c47b-44f9-a3db-5567a302dcca" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:20 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.220) 0:04:17.245 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003252", "end": "2023-12-03 04:34:51.229973", "rc": 0, "start": "2023-12-03 04:34:51.226721" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Oct 10 09:41:04 2023 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=e7a9b339-c47b-44f9-a3db-5567a302dcca / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:25 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.210) 0:04:17.455 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003487", "end": "2023-12-03 04:34:51.438944", "failed_when_result": false, "rc": 0, "start": "2023-12-03 04:34:51.435457" } STDOUT: luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d /dev/sda1 /tmp/storage_testp2iy_xrelukskey TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:34 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.209) 0:04:17.665 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml for sut TASK [Set _storage_pool_tests] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml:5 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.033) 0:04:17.699 ******* ok: [sut] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Verify pool subset] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml:18 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.015) 0:04:17.714 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-volumes.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:2 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.037) 0:04:17.751 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:13 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.016) 0:04:17.768 ******* TASK [Set pvs lvm length] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:22 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.013) 0:04:17.782 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] 
************************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:27 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.014) 0:04:17.796 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:33 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.017) 0:04:17.814 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:42 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.015) 0:04:17.829 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:48 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.015) 0:04:17.845 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:54 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.015) 0:04:17.860 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:59 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.014) 0:04:17.875 ******* TASK [Check MD RAID] *********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:73 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.017) 0:04:17.893 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml for sut TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:8 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.039) 0:04:17.933 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:14 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.017) 0:04:17.950 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:21 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.017) 0:04:17.967 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:28 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.016) 0:04:17.984 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:35 Sunday 03 December 2023 04:34:51 
+0000 (0:00:00.049) 0:04:18.033 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:45 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.017) 0:04:18.050 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:54 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.016) 0:04:18.067 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:64 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.020) 0:04:18.088 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:74 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.021) 0:04:18.109 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:85 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.017) 0:04:18.127 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:95 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.020) 0:04:18.147 ******* ok: [sut] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:76 Sunday 03 December 2023 04:34:51 +0000 (0:00:00.017) 0:04:18.164 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-lvmraid.yml for sut TASK [Validate pool member LVM RAID settings] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-lvmraid.yml:2 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.036) 0:04:18.201 ******* skipping: [sut] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testp2iy_xrelukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 
'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testp2iy_xrelukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:79 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.022) 0:04:18.223 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-thin.yml for sut TASK [Validate pool member thinpool settings] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-thin.yml:2 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.032) 0:04:18.256 ******* skipping: [sut] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testp2iy_xrelukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", 
"storage_test_thin_volume": { "_device": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testp2iy_xrelukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:82 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.021) 0:04:18.278 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:5 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.035) 0:04:18.313 ******* ok: [sut] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:13 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.022) 0:04:18.336 ******* TASK [Validate pool member crypttab entries] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:20 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.016) 0:04:18.353 ******* TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:27 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.015) 0:04:18.369 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:85 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.015) 0:04:18.384 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-vdo.yml for sut TASK [Validate pool member VDO settings] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-vdo.yml:2 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.038) 0:04:18.423 ******* skipping: [sut] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testp2iy_xrelukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 
'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testp2iy_xrelukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Clean up test variables] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:88 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.022) 0:04:18.445 ******* ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-volumes.yml:3 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.017) 0:04:18.463 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml for sut TASK [Set storage volume test variables] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:2 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.034) 0:04:18.497 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] 
******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:21 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.022) 0:04:18.520 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml for sut TASK [Get expected mount device based on device type] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:7 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.077) 0:04:18.598 ******* ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:16 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.018) 0:04:18.617 ******* ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2572529, "block_size": 4096, "block_total": 2598912, "block_used": 26383, "device": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "fstype": "xfs", "inode_available": 5230589, "inode_total": 5230592, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10537078784, "size_total": 10645143552, "uuid": "31ec7184-423b-4708-8f4d-7b9b661749f2" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2572529, "block_size": 4096, "block_total": 2598912, "block_used": 26383, "device": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "fstype": "xfs", "inode_available": 5230589, "inode_total": 5230592, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10537078784, "size_total": 10645143552, "uuid": "31ec7184-423b-4708-8f4d-7b9b661749f2" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:38 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.027) 0:04:18.645 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:51 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.016) 0:04:18.661 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:63 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.019) 0:04:18.681 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] 
********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:71 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.017) 0:04:18.699 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:83 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.014) 0:04:18.714 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:95 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.015) 0:04:18.730 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the mount fs type] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:110 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.015) 0:04:18.745 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get path of test volume device] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:122 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.019) 0:04:18.764 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:128 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.014) 0:04:18.778 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:134 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.014) 0:04:18.793 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:146 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.014) 0:04:18.807 ******* ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:2 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.016) 0:04:18.823 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:40 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.034) 0:04:18.857 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:48 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.019) 0:04:18.877 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:58 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.019) 0:04:18.897 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:71 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.017) 0:04:18.915 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:3 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.015) 0:04:18.931 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:12 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.024) 0:04:18.955 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:3 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.020) 0:04:18.976 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578089.898724, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1701578086.9127424, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1314, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1701578086.9127424, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:9 Sunday 03 December 2023 04:34:52 +0000 (0:00:00.213) 0:04:19.190 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:16 Sunday 03 December 2023 04:34:53 +0000 (0:00:00.052) 0:04:19.242 ******* skipping: 
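
The device-node check above is a stat of /dev/sda1 followed by assertions on the returned attributes. A minimal sketch of the same pattern (the register name st is arbitrary):

- name: Stat the volume's device node (sketch)
  stat:
    path: /dev/sda1
  register: st

- name: Assert the node exists and is a block device
  assert:
    that:
      - st.stat.exists
      - st.stat.isblk
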
[sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:24 Sunday 03 December 2023 04:34:53 +0000 (0:00:00.017) 0:04:19.259 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:30 Sunday 03 December 2023 04:34:53 +0000 (0:00:00.019) 0:04:19.279 ******* ok: [sut] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:34 Sunday 03 December 2023 04:34:53 +0000 (0:00:00.018) 0:04:19.298 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:39 Sunday 03 December 2023 04:34:53 +0000 (0:00:00.016) 0:04:19.315 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:3 Sunday 03 December 2023 04:34:53 +0000 (0:00:00.020) 0:04:19.335 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578089.902724, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1701578087.145741, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1337, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1701578087.145741, "nlink": 1, "path": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:10 Sunday 03 December 2023 04:34:53 +0000 (0:00:00.220) 0:04:19.555 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:17 Sunday 03 December 2023 04:34:56 +0000 (0:00:02.677) 0:04:22.233 ******* ok: [sut] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.007994", "end": "2023-12-03 04:34:56.225760", "rc": 0, "start": "2023-12-03 04:34:56.217766" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: 
aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 729322 Threads: 2 Salt: 0d 2f 29 8f fb 13 c1 cd af 6c 7c 7d db 49 3b a6 44 08 4f de cb 9d 4e 6c 2d 38 a6 0d 1e 51 a4 b4 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 94160 Salt: 49 7c cf 26 c0 25 ee a1 28 6c 18 ed 44 0e 08 3a 25 b9 1b a1 cc 34 ed 86 9a b6 56 50 3c 06 08 17 Digest: 0f a1 a8 01 a4 55 23 71 3b 9d 00 5f 7a 79 04 9d 2a 56 09 b8 6d 9a 12 f2 37 cb f9 c4 28 dc 38 da TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:23 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.218) 0:04:22.451 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:32 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.022) 0:04:22.474 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:45 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.021) 0:04:22.496 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:51 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.020) 0:04:22.516 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:56 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.019) 0:04:22.535 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:69 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.016) 0:04:22.552 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:81 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.016) 0:04:22.568 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:94 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.018) 0:04:22.586 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d /dev/sda1 /tmp/storage_testp2iy_xrelukskey" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_testp2iy_xrelukskey" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:106 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.020) 0:04:22.607 ******* ok: [sut] => { "changed": false 
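
The LUKS header dump above comes from a plain command task. Below is a sketch of collecting that output and checking a couple of fields; note that in this particular run the role's own LUKS version/key-size/cipher checks are skipped, so the assertions are only an example of what such a check could look like, using values taken from the dump:

- name: Collect LUKS header information (sketch of the command shown above)
  command: cryptsetup luksDump /dev/sda1
  register: luks_dump
  changed_when: false

- name: Example assertions against the header (skipped by the role in this run)
  assert:
    that:
      - luks_dump.stdout is search('Version:\s+2')
      - luks_dump.stdout is search('cipher:\s+aes-xts-plain64')
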
} MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:114 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.018) 0:04:22.626 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:122 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.024) 0:04:22.651 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:131 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.024) 0:04:22.675 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:140 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.021) 0:04:22.697 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:8 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.018) 0:04:22.715 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:14 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.017) 0:04:22.733 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:21 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.016) 0:04:22.749 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:28 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.018) 0:04:22.767 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:35 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.020) 0:04:22.788 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:45 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.017) 0:04:22.805 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:54 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.019) 0:04:22.825 ******* skipping: [sut] => { "changed": false, 
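
The crypttab checks above verify that the single matching entry names the LUKS mapping, the backing device, and the key file. A compact sketch of the same idea, splitting the matched line into its three fields (crypttab_entry is an illustrative variable name; the entry text is the one reported by the test above):

- name: Validate the crypttab entry for the volume (sketch)
  vars:
    crypttab_entry: luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d /dev/sda1 /tmp/storage_testp2iy_xrelukskey
  assert:
    that:
      - crypttab_entry.split() | length >= 3
      - crypttab_entry.split()[1] == '/dev/sda1'
      - crypttab_entry.split()[2] == '/tmp/storage_testp2iy_xrelukskey'
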
"skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:63 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.017) 0:04:22.842 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:72 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.016) 0:04:22.859 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:81 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.015) 0:04:22.874 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:3 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.016) 0:04:22.891 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:11 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.019) 0:04:22.910 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:20 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.020) 0:04:22.931 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:28 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.017) 0:04:22.948 ******* ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:32 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.022) 0:04:22.971 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:46 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.021) 0:04:22.993 ******* skipping: [sut] => {} TASK [Show test blockinfo] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:50 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.018) 0:04:23.012 ******* skipping: [sut] => {} TASK [Show test pool size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:54 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.018) 0:04:23.030 ******* skipping: [sut] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:58 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.023) 0:04:23.053 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:68 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.022) 0:04:23.075 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:72 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.017) 0:04:23.093 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:77 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.016) 0:04:23.109 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:83 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.016) 0:04:23.125 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:88 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.014) 0:04:23.140 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:96 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.016) 0:04:23.156 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:104 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.015) 0:04:23.171 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:109 Sunday 03 December 2023 04:34:56 +0000 (0:00:00.015) 0:04:23.186 ******* skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:113 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.014) 0:04:23.201 ******* skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:117 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.014) 0:04:23.216 ******* skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:121 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.014) 0:04:23.231 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:129 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.017) 0:04:23.248 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:138 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.015) 0:04:23.263 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:142 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.015) 0:04:23.279 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:150 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.020) 0:04:23.299 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:156 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.017) 0:04:23.316 ******* ok: [sut] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:160 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.020) 0:04:23.337 ******* ok: [sut] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:164 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.019) 0:04:23.357 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:5 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.047) 0:04:23.404 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:13 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.017) 0:04:23.422 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:18 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.016) 0:04:23.438 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:27 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.016) 0:04:23.454 ******* skipping: [sut] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:35 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.020) 0:04:23.475 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:41 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.021) 0:04:23.496 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:47 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.027) 0:04:23.523 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:27 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.020) 0:04:23.544 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:44 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.022) 0:04:23.567 ******* TASK [Clean up variable namespace] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:54 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.021) 0:04:23.589 ******* ok: [sut] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:329 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.025) 0:04:23.614 ******* ok: [sut] => { "changed": false, "path": "/tmp/storage_testp2iy_xrelukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:339 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.216) 0:04:23.831 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml for sut TASK [Store global variable value copy] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:4 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.025) 0:04:23.856 ******* ok: [sut] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:10 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.020) 0:04:23.877 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.021) 0:04:23.898 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK 
[linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.025) 0:04:23.924 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.025) 0:04:23.950 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.043) 0:04:23.994 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.017) 0:04:24.011 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.016) 0:04:24.027 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.015) 0:04:24.043 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.018) 0:04:24.061 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:34:57 +0000 (0:00:00.034) 0:04:24.096 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:35:00 +0000 (0:00:02.613) 0:04:26.710 ******* ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:35:00 +0000 (0:00:00.025) 0:04:26.735 ******* ok: [sut] => { "storage_volumes": [] } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:35:00 +0000 (0:00:00.022) 0:04:26.758 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:35:02 +0000 (0:00:01.904) 0:04:28.662 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:35:02 +0000 (0:00:00.031) 0:04:28.693 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:35:02 +0000 (0:00:00.026) 0:04:28.719 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:35:02 +0000 (0:00:00.018) 0:04:28.738 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:35:02 +0000 
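
"Show storage_pools" above prints the pool specification this test run passes to the role. Restated as it would appear in a playbook (a sketch; the test playbook itself wires this up through variables rather than literals), note that encryption is requested but no password or key is supplied, which is exactly what the role rejects a few tasks later:

- name: Invoke the storage role with an encrypted LVM volume (sketch of the spec printed above)
  include_role:
    name: linux-system-roles.storage
  vars:
    storage_pools:
      - name: foo
        type: lvm
        disks:
          - sda
        volumes:
          - name: test1
            size: 4g
            mount_point: /opt/test1
            encryption: true
            # no encryption_password or encryption_key here, which is why the role
            # later stops with "encrypted volume 'test1' missing key/password"
    storage_volumes: []
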
(0:00:00.026) 0:04:28.765 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2 TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:35:05 +0000 (0:00:02.611) 0:04:31.377 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, 
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": 
{ "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { 
"name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", 
"state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:35:07 +0000 (0:00:02.051) 0:04:33.429 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:35:07 +0000 (0:00:00.025) 0:04:33.454 ******* TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:35:07 +0000 (0:00:00.014) 0:04:33.469 ******* fatal: [sut]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [linux-system-roles.storage : Failed message] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:106 Sunday 03 December 2023 04:35:09 +0000 (0:00:02.064) 0:04:35.533 ******* fatal: [sut]: FAILED! => { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 
'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:35:09 +0000 (0:00:00.020) 0:04:35.554 ******* TASK [Check that we failed in the role] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:29 Sunday 03 December 2023 04:35:09 +0000 (0:00:00.013) 0:04:35.567 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:34 Sunday 03 December 2023 04:35:09 +0000 (0:00:00.017) 0:04:35.584 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:45 Sunday 03 December 2023 04:35:09 +0000 (0:00:00.062) 0:04:35.647 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:357 Sunday 03 December 2023 04:35:09 +0000 (0:00:00.016) 0:04:35.663 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 04:35:09 +0000 (0:00:00.031) 0:04:35.695 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:35:09 +0000 (0:00:00.026) 0:04:35.721 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:35:09 +0000 (0:00:00.032) 0:04:35.754 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { 
"_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:35:09 +0000 (0:00:00.043) 0:04:35.797 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:35:09 +0000 (0:00:00.016) 0:04:35.814 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:35:09 +0000 (0:00:00.016) 0:04:35.830 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:35:09 +0000 (0:00:00.014) 0:04:35.845 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:35:09 +0000 (0:00:00.014) 0:04:35.859 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:35:09 +0000 (0:00:00.032) 0:04:35.891 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:35:12 +0000 (0:00:02.638) 0:04:38.530 ******* ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", 
"mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:35:12 +0000 (0:00:00.027) 0:04:38.557 ******* ok: [sut] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:35:12 +0000 (0:00:00.019) 0:04:38.577 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:35:14 +0000 (0:00:01.894) 0:04:40.471 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:35:14 +0000 (0:00:00.034) 0:04:40.505 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:35:14 +0000 (0:00:00.032) 0:04:40.538 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:35:14 +0000 (0:00:00.018) 0:04:40.556 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:35:14 +0000 (0:00:00.027) 0:04:40.584 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2 TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:35:17 +0000 (0:00:02.615) 0:04:43.199 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": 
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": 
"dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": 
"systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { 
"name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": 
"systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:35:19 +0000 (0:00:02.032) 0:04:45.231 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd 
cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:35:19 +0000 (0:00:00.025) 0:04:45.257 ******* TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:35:19 +0000 (0:00:00.015) 0:04:45.273 ******* changed: [sut] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "state": "mounted" } ], "packages": [ "lvm2", "e2fsprogs", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": 
null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:98 Sunday 03 December 2023 04:35:28 +0000 (0:00:09.311) 0:04:54.584 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:35:28 +0000 (0:00:00.046) 0:04:54.630 ******* TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:116 Sunday 03 December 2023 04:35:28 +0000 (0:00:00.015) 0:04:54.646 ******* ok: [sut] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "state": "mounted" } ], "packages": [ "lvm2", "e2fsprogs", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", 
"raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:121 Sunday 03 December 2023 04:35:28 +0000 (0:00:00.020) 0:04:54.667 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:125 Sunday 03 December 2023 04:35:28 +0000 (0:00:00.018) 0:04:54.685 ******* ok: [sut] => 
{ "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:141 Sunday 03 December 2023 04:35:28 +0000 (0:00:00.017) 0:04:54.703 ******* changed: [sut] => (item={'src': '/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:153 Sunday 03 December 2023 04:35:28 +0000 (0:00:00.225) 0:04:54.928 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:158 Sunday 03 December 2023 04:35:29 +0000 (0:00:00.807) 0:04:55.736 ******* changed: [sut] => (item={'src': '/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37" } TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:170 Sunday 03 December 2023 04:35:29 +0000 (0:00:00.250) 0:04:55.986 ******* skipping: [sut] => (item={'src': '/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:185 Sunday 03 December 2023 04:35:29 +0000 (0:00:00.023) 0:04:56.010 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:193 Sunday 03 December 2023 04:35:30 +0000 (0:00:00.801) 0:04:56.811 
******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578089.8977242, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a94ca28842add589817a1c3a30a83c78841bce26", "ctime": 1701578089.895724, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263373, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1701578089.893724, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 85, "uid": 0, "version": "861652075", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Sunday 03 December 2023 04:35:30 +0000 (0:00:00.220) 0:04:57.032 ******* changed: [sut] => (item={'backing_device': '/dev/sda1', 'name': 'luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [sut] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-7569a48d-28d0-4f69-8223-1a98e72edb37', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:220 Sunday 03 December 2023 04:35:31 +0000 (0:00:00.463) 0:04:57.495 ******* ok: [sut] TASK [Verify role results] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:376 Sunday 03 December 2023 04:35:32 +0000 (0:00:00.821) 0:04:58.317 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml for sut TASK [Print out pool information] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:2 Sunday 03 December 2023 04:35:32 +0000 (0:00:00.030) 0:04:58.348 ******* ok: [sut] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": 
"aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:7 Sunday 03 December 2023 04:35:32 +0000 (0:00:00.024) 0:04:58.373 ******* skipping: [sut] => {} TASK [Collect info about the volumes.] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:15 Sunday 03 December 2023 04:35:32 +0000 (0:00:00.017) 0:04:58.390 ******* ok: [sut] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "7569a48d-28d0-4f69-8223-1a98e72edb37" }, "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "size": "4G", "type": "crypt", "uuid": "5744f983-ac6b-4787-9939-c52a5d9beb51" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "zHVbMb-LhEM-NyDy-agwI-gQkg-OzfF-MA4dTW" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "e7a9b339-c47b-44f9-a3db-5567a302dcca" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:20 Sunday 03 December 2023 04:35:32 +0000 (0:00:00.217) 0:04:58.608 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003300", "end": "2023-12-03 04:35:32.595297", "rc": 
0, "start": "2023-12-03 04:35:32.591997" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Oct 10 09:41:04 2023 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=e7a9b339-c47b-44f9-a3db-5567a302dcca / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:25 Sunday 03 December 2023 04:35:32 +0000 (0:00:00.218) 0:04:58.826 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003255", "end": "2023-12-03 04:35:32.812180", "failed_when_result": false, "rc": 0, "start": "2023-12-03 04:35:32.808925" } STDOUT: luks-7569a48d-28d0-4f69-8223-1a98e72edb37 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:34 Sunday 03 December 2023 04:35:32 +0000 (0:00:00.211) 0:04:59.038 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml for sut TASK [Set _storage_pool_tests] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml:5 Sunday 03 December 2023 04:35:32 +0000 (0:00:00.035) 0:04:59.073 ******* ok: [sut] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Verify pool subset] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml:18 Sunday 03 December 2023 04:35:32 +0000 (0:00:00.016) 0:04:59.089 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-volumes.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:2 Sunday 03 December 2023 04:35:32 +0000 (0:00:00.034) 0:04:59.124 ******* ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:13 Sunday 03 December 2023 04:35:32 +0000 (0:00:00.022) 
0:04:59.147 ******* ok: [sut] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:22 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.311) 0:04:59.458 ******* ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:27 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.020) 0:04:59.478 ******* ok: [sut] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:33 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.020) 0:04:59.498 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:42 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.020) 0:04:59.518 ******* ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:48 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.019) 0:04:59.537 ******* ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:54 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.018) 0:04:59.556 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:59 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.016) 0:04:59.573 ******* ok: [sut] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:73 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.075) 0:04:59.648 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml for sut TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:8 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.032) 0:04:59.680 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:14 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.016) 0:04:59.697 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:21 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.016) 
0:04:59.714 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:28 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.014) 0:04:59.728 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:35 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.014) 0:04:59.743 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:45 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.016) 0:04:59.759 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:54 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.017) 0:04:59.776 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:64 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.017) 0:04:59.793 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:74 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.018) 0:04:59.812 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:85 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.015) 0:04:59.827 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:95 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.015) 0:04:59.843 ******* ok: [sut] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:76 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.015) 0:04:59.858 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-lvmraid.yml for sut TASK [Validate pool member LVM RAID settings] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-lvmraid.yml:2 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.034) 0:04:59.892 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml for sut TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:8 Sunday 03 
December 2023 04:35:33 +0000 (0:00:00.032) 0:04:59.924 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:16 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.016) 0:04:59.941 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:21 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.016) 0:04:59.957 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:29 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.016) 0:04:59.973 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:34 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.020) 0:04:59.993 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:40 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.018) 0:05:00.011 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:46 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.015) 0:05:00.027 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:79 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.018) 0:05:00.046 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-thin.yml for sut TASK [Validate pool member thinpool settings] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-thin.yml:2 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.046) 0:05:00.092 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml for sut TASK [Get information about thinpool] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml:8 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.039) 0:05:00.132 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml:16 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.016) 0:05:00.148 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml:23 Sunday 03 December 2023 04:35:33 
+0000 (0:00:00.017) 0:05:00.166 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml:27 Sunday 03 December 2023 04:35:33 +0000 (0:00:00.018) 0:05:00.184 ******* ok: [sut] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:82 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.015) 0:05:00.200 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:5 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.034) 0:05:00.234 ******* ok: [sut] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:13 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.020) 0:05:00.254 ******* skipping: [sut] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:20 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.018) 0:05:00.273 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml for sut TASK [Set variables used by tests] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:2 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.030) 0:05:00.304 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:9 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.018) 0:05:00.323 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:18 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.018) 0:05:00.341 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:27 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.016) 0:05:00.358 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:37 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.017) 0:05:00.376 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test 
variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:47 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.016) 0:05:00.392 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:27 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.017) 0:05:00.409 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:85 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.015) 0:05:00.425 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-vdo.yml for sut TASK [Validate pool member VDO settings] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-vdo.yml:2 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.041) 0:05:00.466 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml for sut TASK [Get information about VDO deduplication] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:9 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.034) 0:05:00.500 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:16 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.017) 0:05:00.517 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:22 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.016) 0:05:00.534 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:28 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.016) 0:05:00.551 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:35 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.016) 0:05:00.567 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:41 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.016) 0:05:00.583 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:47 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.015) 0:05:00.598 ******* ok: [sut] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } 
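For reference, the LUKS-on-LVM volume whose members and crypttab entries are checked above corresponds to a storage_pools definition along the following lines. This is a minimal illustrative sketch assembled from the pool information printed earlier in this run (pool "foo" on sda, volume "test1", 4g xfs on /opt/test1, LUKS1 with aes-xts-plain64 and a 512-bit key); it is not part of the captured output, and the exact invocation used by tests_luks.yml may differ.

    - hosts: all
      tasks:
        - name: Describe the encrypted volume to the storage role (illustrative only)
          include_role:
            name: linux-system-roles.storage
          vars:
            storage_pools:
              - name: foo
                disks: [sda]
                volumes:
                  - name: test1
                    size: 4g
                    fs_type: xfs
                    mount_point: /opt/test1
                    encryption: true
                    encryption_luks_version: luks1
                    encryption_cipher: aes-xts-plain64
                    encryption_key_size: 512
                    encryption_password: yabbadabbadoo

Running the role with such a definition is what produces the /etc/fstab and /etc/crypttab entries that the verification tasks below read back and assert against.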
TASK [Clean up test variables] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:88 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.015) 0:05:00.614 ******* ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-volumes.yml:3 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.047) 0:05:00.662 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml for sut TASK [Set storage volume test variables] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:2 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.031) 0:05:00.693 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:21 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.020) 0:05:00.713 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml for sut TASK [Get expected mount device based on device type] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:7 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.080) 0:05:00.793 ******* ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:16 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.018) 0:05:00.812 ******* ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1014697, "block_size": 4096, "block_total": 1030144, "block_used": 15447, "device": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "fstype": "xfs", "inode_available": 2093053, "inode_total": 2093056, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4156198912, "size_total": 4219469824, "uuid": "5744f983-ac6b-4787-9939-c52a5d9beb51" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1014697, "block_size": 4096, "block_total": 1030144, "block_used": 15447, "device": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "fstype": "xfs", "inode_available": 2093053, "inode_total": 2093056, 
"inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4156198912, "size_total": 4219469824, "uuid": "5744f983-ac6b-4787-9939-c52a5d9beb51" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:38 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.024) 0:05:00.837 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:51 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.016) 0:05:00.854 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:63 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.019) 0:05:00.874 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:71 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.020) 0:05:00.894 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:83 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.015) 0:05:00.910 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:95 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.016) 0:05:00.926 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the mount fs type] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:110 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.016) 0:05:00.943 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get path of test volume device] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:122 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.022) 0:05:00.965 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:128 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.017) 0:05:00.983 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:134 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.022) 0:05:01.006 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:146 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.017) 0:05:01.023 ******* ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:2 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.018) 0:05:01.042 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:40 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.039) 0:05:01.081 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:48 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.021) 0:05:01.102 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:58 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.021) 0:05:01.123 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:71 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.015) 0:05:01.138 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:3 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.015) 0:05:01.154 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:12 Sunday 03 December 2023 04:35:34 +0000 (0:00:00.023) 0:05:01.178 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:3 Sunday 03 December 2023 04:35:35 +0000 (0:00:00.022) 0:05:01.200 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578131.2714708, 
"attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1701578128.0194907, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1524, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1701578128.0194907, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:9 Sunday 03 December 2023 04:35:35 +0000 (0:00:00.221) 0:05:01.422 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:16 Sunday 03 December 2023 04:35:35 +0000 (0:00:00.027) 0:05:01.449 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:24 Sunday 03 December 2023 04:35:35 +0000 (0:00:00.020) 0:05:01.470 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:30 Sunday 03 December 2023 04:35:35 +0000 (0:00:00.022) 0:05:01.492 ******* ok: [sut] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:34 Sunday 03 December 2023 04:35:35 +0000 (0:00:00.019) 0:05:01.512 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:39 Sunday 03 December 2023 04:35:35 +0000 (0:00:00.017) 0:05:01.530 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:3 Sunday 03 December 2023 04:35:35 +0000 (0:00:00.021) 0:05:01.551 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578131.255471, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1701578128.2634892, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1564, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1701578128.2634892, "nlink": 1, "path": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, 
"writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:10 Sunday 03 December 2023 04:35:35 +0000 (0:00:00.225) 0:05:01.777 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:17 Sunday 03 December 2023 04:35:38 +0000 (0:00:02.601) 0:05:04.378 ******* ok: [sut] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.007400", "end": "2023-12-03 04:35:38.371882", "rc": 0, "start": "2023-12-03 04:35:38.364482" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 16384 MK bits: 512 MK digest: 4d ca 8e ed ec ae ed 1c c9 77 bb 2e 51 11 13 72 8e cb 71 3d MK salt: 68 1a 5b db f5 eb ed 4d 68 64 4b 6d ff ca ec 93 f4 a4 60 6f 77 69 86 e9 ea cb 73 78 66 3a 59 34 MK iterations: 94160 UUID: 7569a48d-28d0-4f69-8223-1a98e72edb37 Key Slot 0: ENABLED Iterations: 1504412 Salt: ae ab d8 e8 08 c7 d2 b8 10 c6 21 58 d2 e8 be 52 fa b3 2e 28 96 96 12 f2 09 ce fb 51 c5 2d a0 0c Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:23 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.219) 0:05:04.598 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:32 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.022) 0:05:04.620 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:45 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.024) 0:05:04.645 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:51 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.021) 0:05:04.667 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:56 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.021) 0:05:04.688 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:69 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.027) 0:05:04.716 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:81 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.025) 0:05:04.742 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:94 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.024) 0:05:04.766 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-7569a48d-28d0-4f69-8223-1a98e72edb37 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:106 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.020) 0:05:04.787 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:114 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.018) 0:05:04.805 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:122 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.020) 0:05:04.826 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:131 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.020) 0:05:04.846 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:140 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.020) 0:05:04.867 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:8 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.014) 0:05:04.882 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:14 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.014) 0:05:04.897 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:21 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.014) 0:05:04.911 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:28 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.018) 0:05:04.929 ******* skipping: [sut] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:35 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.015) 0:05:04.945 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:45 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.017) 0:05:04.963 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:54 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.051) 0:05:05.014 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:63 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.017) 0:05:05.032 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:72 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.015) 0:05:05.047 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:81 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.016) 0:05:05.064 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:3 Sunday 03 December 2023 04:35:38 +0000 (0:00:00.016) 0:05:05.080 ******* ok: [sut] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:11 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.277) 0:05:05.357 ******* ok: [sut] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:20 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.211) 0:05:05.569 ******* ok: [sut] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:28 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.022) 0:05:05.592 ******* ok: [sut] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:32 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.017) 0:05:05.609 ******* ok: [sut] => { "bytes": 10726680821, "changed": 
false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:46 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.212) 0:05:05.822 ******* skipping: [sut] => {} TASK [Show test blockinfo] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:50 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.021) 0:05:05.843 ******* skipping: [sut] => {} TASK [Show test pool size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:54 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.020) 0:05:05.864 ******* skipping: [sut] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:58 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.020) 0:05:05.885 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:68 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.021) 0:05:05.906 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:72 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.020) 0:05:05.926 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:77 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.017) 0:05:05.944 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:83 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.017) 0:05:05.961 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:88 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.016) 0:05:05.978 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:96 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.017) 0:05:05.995 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:104 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.017) 0:05:06.013 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:109 Sunday 03 
December 2023 04:35:39 +0000 (0:00:00.019) 0:05:06.032 ******* skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:113 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.018) 0:05:06.051 ******* skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:117 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.018) 0:05:06.069 ******* skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:121 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.019) 0:05:06.088 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:129 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.017) 0:05:06.106 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:138 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.020) 0:05:06.126 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:142 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.018) 0:05:06.145 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:150 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.016) 0:05:06.162 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:156 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.016) 0:05:06.179 ******* ok: [sut] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:160 Sunday 03 December 2023 04:35:39 +0000 (0:00:00.018) 0:05:06.197 ******* ok: [sut] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:164 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.016) 0:05:06.214 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:5 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.022) 0:05:06.236 ******* ok: [sut] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", 
"name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.026565", "end": "2023-12-03 04:35:40.246818", "rc": 0, "start": "2023-12-03 04:35:40.220253" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:13 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.240) 0:05:06.476 ******* ok: [sut] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:18 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.021) 0:05:06.497 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:27 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.022) 0:05:06.520 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:35 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.018) 0:05:06.538 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:41 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.018) 0:05:06.556 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:47 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.022) 0:05:06.579 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:27 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.018) 0:05:06.598 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:44 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.015) 0:05:06.613 ******* TASK [Clean up variable namespace] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:54 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.017) 0:05:06.631 ******* ok: [sut] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:379 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.016) 0:05:06.647 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 
04:35:40 +0000 (0:00:00.036) 0:05:06.684 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.025) 0:05:06.709 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.018) 0:05:06.728 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.040) 0:05:06.768 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.015) 0:05:06.784 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.016) 0:05:06.801 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.015) 0:05:06.817 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.016) 0:05:06.833 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK 
[linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:35:40 +0000 (0:00:00.034) 0:05:06.868 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:35:43 +0000 (0:00:02.608) 0:05:09.476 ******* ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:35:43 +0000 (0:00:00.055) 0:05:09.532 ******* ok: [sut] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:35:43 +0000 (0:00:00.017) 0:05:09.550 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:35:45 +0000 (0:00:02.084) 0:05:11.634 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:35:45 +0000 (0:00:00.029) 0:05:11.663 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:35:45 +0000 (0:00:00.025) 0:05:11.689 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:35:45 +0000 (0:00:00.015) 0:05:11.705 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:35:45 +0000 (0:00:00.028) 0:05:11.733 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:35:48 +0000 (0:00:02.593) 0:05:14.327 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { 
"name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { 
"name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { 
"name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2deffad32c\\x2dbd3d\\x2d4bf2\\x2d8ee6\\x2dfe17fb97ab0d.service": { "name": "systemd-cryptsetup@luks\\x2deffad32c\\x2dbd3d\\x2d4bf2\\x2d8ee6\\x2dfe17fb97ab0d.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": 
"systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:35:50 +0000 (0:00:02.061) 0:05:16.388 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2deffad32c\\x2dbd3d\\x2d4bf2\\x2d8ee6\\x2dfe17fb97ab0d.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:35:50 +0000 (0:00:00.025) 0:05:16.414 ******* changed: [sut] => (item=systemd-cryptsetup@luks\x2deffad32c\x2dbd3d\x2d4bf2\x2d8ee6\x2dfe17fb97ab0d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2deffad32c\\x2dbd3d\\x2d4bf2\\x2d8ee6\\x2dfe17fb97ab0d.service", "name": "systemd-cryptsetup@luks\\x2deffad32c\\x2dbd3d\\x2d4bf2\\x2d8ee6\\x2dfe17fb97ab0d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-udevd-kernel.socket tmp.mount cryptsetup-pre.target systemd-journald.socket -.mount \"system-systemd\\\\x2dcryptsetup.slice\" dev-sda1.device", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2deffad32c\\\\x2dbd3d\\\\x2d4bf2\\\\x2d8ee6\\\\x2dfe17fb97ab0d.target\"", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", 
"ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d /dev/sda1 /tmp/storage_testp2iy_xrelukskey ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d /dev/sda1 /tmp/storage_testp2iy_xrelukskey ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-effad32c-bd3d-4bf2-8ee6-fe17fb97ab0d ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2deffad32c\\x2dbd3d\\x2d4bf2\\x2d8ee6\\x2dfe17fb97ab0d.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2deffad32c\\x2dbd3d\\x2d4bf2\\x2d8ee6\\x2dfe17fb97ab0d.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "14732", "LimitNPROCSoft": "14732", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": 
"infinity", "LimitSIGPENDING": "14732", "LimitSIGPENDINGSoft": "14732", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2deffad32c\\\\x2dbd3d\\\\x2d4bf2\\\\x2d8ee6\\\\x2dfe17fb97ab0d.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\" tmp.mount -.mount", "RequiresMountsFor": "/tmp/storage_testp2iy_xrelukskey", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2023-12-03 04:35:30 UTC", "StateChangeTimestampMonotonic": "1046547434", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4419", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": 
"\"blockdev@dev-mapper-luks\\\\x2deffad32c\\\\x2dbd3d\\\\x2d4bf2\\\\x2d8ee6\\\\x2dfe17fb97ab0d.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:35:51 +0000 (0:00:00.829) 0:05:17.244 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "e2fsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:98 Sunday 03 December 2023 04:35:53 +0000 (0:00:02.163) 0:05:19.407 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:35:53 +0000 (0:00:00.016) 0:05:19.424 ******* changed: [sut] => (item=systemd-cryptsetup@luks\x2deffad32c\x2dbd3d\x2d4bf2\x2d8ee6\x2dfe17fb97ab0d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2deffad32c\\x2dbd3d\\x2d4bf2\\x2d8ee6\\x2dfe17fb97ab0d.service", "name": 
"systemd-cryptsetup@luks\\x2deffad32c\\x2dbd3d\\x2d4bf2\\x2d8ee6\\x2dfe17fb97ab0d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2deffad32c\\x2dbd3d\\x2d4bf2\\x2d8ee6\\x2dfe17fb97ab0d.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2deffad32c\\x2dbd3d\\x2d4bf2\\x2d8ee6\\x2dfe17fb97ab0d.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2deffad32c\\x2dbd3d\\x2d4bf2\\x2d8ee6\\x2dfe17fb97ab0d.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "486428672", "LimitMEMLOCKSoft": "486428672", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": 
"0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "14732", "LimitNPROCSoft": "14732", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14732", "LimitSIGPENDINGSoft": "14732", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2deffad32c\\x2dbd3d\\x2d4bf2\\x2d8ee6\\x2dfe17fb97ab0d.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2deffad32c\\\\x2dbd3d\\\\x2d4bf2\\\\x2d8ee6\\\\x2dfe17fb97ab0d.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4419", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": 
"masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:116 Sunday 03 December 2023 04:35:54 +0000 (0:00:00.832) 0:05:20.257 ******* ok: [sut] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "e2fsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:121 Sunday 03 December 2023 04:35:54 +0000 (0:00:00.021) 0:05:20.279 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, 
"cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:125 Sunday 03 December 2023 04:35:54 +0000 (0:00:00.021) 0:05:20.300 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:141 Sunday 03 December 2023 04:35:54 +0000 (0:00:00.020) 0:05:20.320 ******* TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:153 Sunday 03 December 2023 04:35:54 +0000 (0:00:00.017) 0:05:20.337 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:158 Sunday 03 December 2023 04:35:54 +0000 (0:00:00.789) 0:05:21.127 ******* ok: [sut] => (item={'src': '/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37" } TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:170 Sunday 03 December 2023 04:35:55 +0000 (0:00:00.219) 0:05:21.346 ******* skipping: [sut] => (item={'src': '/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "state": "mounted" 
}, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:185 Sunday 03 December 2023 04:35:55 +0000 (0:00:00.022) 0:05:21.368 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:193 Sunday 03 December 2023 04:35:55 +0000 (0:00:00.789) 0:05:22.158 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578131.2704709, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "5c4a230e44cca0d5a933df0616a7c4b2f362f4de", "ctime": 1701578131.253471, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263373, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1701578131.252471, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "3435316783", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Sunday 03 December 2023 04:35:56 +0000 (0:00:00.221) 0:05:22.379 ******* TASK [linux-system-roles.storage : Update facts] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:220 Sunday 03 December 2023 04:35:56 +0000 (0:00:00.015) 0:05:22.394 ******* ok: [sut] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:393 Sunday 03 December 2023 04:35:57 +0000 (0:00:00.804) 0:05:23.198 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify role results] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:400 Sunday 03 December 2023 04:35:57 +0000 (0:00:00.055) 0:05:23.254 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml for sut TASK [Print out pool information] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:2 Sunday 03 December 2023 04:35:57 +0000 (0:00:00.032) 0:05:23.286 ******* ok: [sut] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:7 Sunday 03 December 2023 04:35:57 +0000 (0:00:00.024) 0:05:23.310 ******* skipping: [sut] => {} TASK [Collect info about the volumes.] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:15 Sunday 03 December 2023 04:35:57 +0000 (0:00:00.015) 0:05:23.326 ******* ok: [sut] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "7569a48d-28d0-4f69-8223-1a98e72edb37" }, "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "size": "4G", "type": "crypt", "uuid": "5744f983-ac6b-4787-9939-c52a5d9beb51" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "zHVbMb-LhEM-NyDy-agwI-gQkg-OzfF-MA4dTW" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "e7a9b339-c47b-44f9-a3db-5567a302dcca" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:20 Sunday 03 December 2023 04:35:57 +0000 (0:00:00.217) 0:05:23.543 ******* ok: 
[sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003327", "end": "2023-12-03 04:35:57.529153", "rc": 0, "start": "2023-12-03 04:35:57.525826" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Oct 10 09:41:04 2023 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=e7a9b339-c47b-44f9-a3db-5567a302dcca / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:25 Sunday 03 December 2023 04:35:57 +0000 (0:00:00.211) 0:05:23.755 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003258", "end": "2023-12-03 04:35:57.741441", "failed_when_result": false, "rc": 0, "start": "2023-12-03 04:35:57.738183" } STDOUT: luks-7569a48d-28d0-4f69-8223-1a98e72edb37 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:34 Sunday 03 December 2023 04:35:57 +0000 (0:00:00.213) 0:05:23.968 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml for sut TASK [Set _storage_pool_tests] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml:5 Sunday 03 December 2023 04:35:57 +0000 (0:00:00.034) 0:05:24.003 ******* ok: [sut] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Verify pool subset] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml:18 Sunday 03 December 2023 04:35:57 +0000 (0:00:00.017) 0:05:24.021 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-volumes.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:2 Sunday 03 December 2023 04:35:57 +0000 (0:00:00.038) 0:05:24.059 ******* ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task 
path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:13 Sunday 03 December 2023 04:35:57 +0000 (0:00:00.026) 0:05:24.086 ******* ok: [sut] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:22 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.219) 0:05:24.305 ******* ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:27 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.023) 0:05:24.328 ******* ok: [sut] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:33 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.025) 0:05:24.354 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:42 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.025) 0:05:24.379 ******* ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:48 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.027) 0:05:24.406 ******* ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:54 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.023) 0:05:24.430 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:59 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.017) 0:05:24.447 ******* ok: [sut] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:73 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.026) 0:05:24.473 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml for sut TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:8 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.033) 0:05:24.506 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:14 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.017) 0:05:24.524 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* 
task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:21 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.018) 0:05:24.542 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:28 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.017) 0:05:24.559 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:35 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.017) 0:05:24.576 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:45 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.016) 0:05:24.593 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:54 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.020) 0:05:24.614 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:64 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.017) 0:05:24.632 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:74 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.017) 0:05:24.649 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:85 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.018) 0:05:24.668 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:95 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.017) 0:05:24.685 ******* ok: [sut] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:76 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.016) 0:05:24.701 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-lvmraid.yml for sut TASK [Validate pool member LVM RAID settings] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-lvmraid.yml:2 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.035) 0:05:24.737 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml for sut TASK [Get information about the LV] 
******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:8 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.035) 0:05:24.773 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:16 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.018) 0:05:24.791 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:21 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.016) 0:05:24.808 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:29 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.017) 0:05:24.826 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:34 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.020) 0:05:24.846 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:40 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.017) 0:05:24.863 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:46 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.016) 0:05:24.879 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:79 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.020) 0:05:24.900 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-thin.yml for sut TASK [Validate pool member thinpool settings] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-thin.yml:2 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.042) 0:05:24.942 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml for sut TASK [Get information about thinpool] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml:8 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.077) 0:05:25.020 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml:16 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.018) 0:05:25.038 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name 
is not provided)] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml:23 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.017) 0:05:25.055 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml:27 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.017) 0:05:25.073 ******* ok: [sut] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:82 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.017) 0:05:25.090 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:5 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.042) 0:05:25.132 ******* ok: [sut] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:13 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.020) 0:05:25.153 ******* skipping: [sut] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:20 Sunday 03 December 2023 04:35:58 +0000 (0:00:00.021) 0:05:25.175 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml for sut TASK [Set variables used by tests] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:2 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.034) 0:05:25.209 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:9 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.022) 0:05:25.231 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:18 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.020) 0:05:25.252 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:27 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.016) 0:05:25.269 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:37 Sunday 03 December 2023 04:35:59 +0000 
(0:00:00.016) 0:05:25.285 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:47 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.017) 0:05:25.303 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:27 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.016) 0:05:25.319 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:85 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.018) 0:05:25.337 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-vdo.yml for sut TASK [Validate pool member VDO settings] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-vdo.yml:2 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.037) 0:05:25.375 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml for sut TASK [Get information about VDO deduplication] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:9 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.041) 0:05:25.416 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:16 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.023) 0:05:25.439 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:22 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.019) 0:05:25.459 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:28 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.019) 0:05:25.478 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:35 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.020) 0:05:25.498 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:41 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.018) 0:05:25.517 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:47 Sunday 03 December 
2023 04:35:59 +0000 (0:00:00.020) 0:05:25.538 ******* ok: [sut] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:88 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.018) 0:05:25.556 ******* ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-volumes.yml:3 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.016) 0:05:25.573 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml for sut TASK [Set storage volume test variables] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:2 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.033) 0:05:25.606 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:21 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.025) 0:05:25.632 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml for sut TASK [Get expected mount device based on device type] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:7 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.083) 0:05:25.715 ******* ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:16 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.021) 0:05:25.737 ******* ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1014697, "block_size": 4096, "block_total": 1030144, "block_used": 15447, "device": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "fstype": "xfs", "inode_available": 2093053, "inode_total": 2093056, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4156198912, "size_total": 4219469824, "uuid": "5744f983-ac6b-4787-9939-c52a5d9beb51" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1014697, "block_size": 4096, "block_total": 1030144, "block_used": 15447, 
"device": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "fstype": "xfs", "inode_available": 2093053, "inode_total": 2093056, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4156198912, "size_total": 4219469824, "uuid": "5744f983-ac6b-4787-9939-c52a5d9beb51" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:38 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.027) 0:05:25.764 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:51 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.018) 0:05:25.782 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:63 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.022) 0:05:25.805 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:71 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.024) 0:05:25.829 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:83 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.016) 0:05:25.846 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:95 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.017) 0:05:25.863 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the mount fs type] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:110 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.017) 0:05:25.881 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get path of test volume device] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:122 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.021) 0:05:25.903 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:128 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.017) 0:05:25.920 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:134 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.016) 0:05:25.936 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Unset facts] ************************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:146 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.017) 0:05:25.954 ******* ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:2 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.017) 0:05:25.972 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:40 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.043) 0:05:26.015 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:48 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.021) 0:05:26.037 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:58 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.019) 0:05:26.057 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:71 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.016) 0:05:26.073 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:3 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.020) 0:05:26.094 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:12 Sunday 03 December 2023 04:35:59 +0000 (0:00:00.023) 0:05:26.118 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:3 Sunday 03 December 2023 04:35:59 +0000 
(0:00:00.069) 0:05:26.187 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578138.3704274, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1701578128.0194907, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1524, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1701578128.0194907, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:9 Sunday 03 December 2023 04:36:00 +0000 (0:00:00.220) 0:05:26.407 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:16 Sunday 03 December 2023 04:36:00 +0000 (0:00:00.022) 0:05:26.430 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:24 Sunday 03 December 2023 04:36:00 +0000 (0:00:00.016) 0:05:26.446 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:30 Sunday 03 December 2023 04:36:00 +0000 (0:00:00.019) 0:05:26.466 ******* ok: [sut] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:34 Sunday 03 December 2023 04:36:00 +0000 (0:00:00.018) 0:05:26.484 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:39 Sunday 03 December 2023 04:36:00 +0000 (0:00:00.016) 0:05:26.501 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:3 Sunday 03 December 2023 04:36:00 +0000 (0:00:00.020) 0:05:26.521 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578153.124337, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1701578128.2634892, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1564, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1701578128.2634892, "nlink": 1, "path": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "pw_name": "root", "readable": true, 
"rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:10 Sunday 03 December 2023 04:36:00 +0000 (0:00:00.221) 0:05:26.743 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:17 Sunday 03 December 2023 04:36:03 +0000 (0:00:02.634) 0:05:29.378 ******* ok: [sut] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.008107", "end": "2023-12-03 04:36:03.374148", "rc": 0, "start": "2023-12-03 04:36:03.366041" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 16384 MK bits: 512 MK digest: 4d ca 8e ed ec ae ed 1c c9 77 bb 2e 51 11 13 72 8e cb 71 3d MK salt: 68 1a 5b db f5 eb ed 4d 68 64 4b 6d ff ca ec 93 f4 a4 60 6f 77 69 86 e9 ea cb 73 78 66 3a 59 34 MK iterations: 94160 UUID: 7569a48d-28d0-4f69-8223-1a98e72edb37 Key Slot 0: ENABLED Iterations: 1504412 Salt: ae ab d8 e8 08 c7 d2 b8 10 c6 21 58 d2 e8 be 52 fa b3 2e 28 96 96 12 f2 09 ce fb 51 c5 2d a0 0c Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:23 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.227) 0:05:29.606 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:32 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.024) 0:05:29.631 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:45 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.025) 0:05:29.657 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:51 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.026) 0:05:29.683 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:56 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.023) 0:05:29.707 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:69 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.031) 0:05:29.738 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS 
cipher] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:81 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.018) 0:05:29.757 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:94 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.020) 0:05:29.777 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-7569a48d-28d0-4f69-8223-1a98e72edb37 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:106 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.024) 0:05:29.801 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:114 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.026) 0:05:29.828 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:122 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.023) 0:05:29.851 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:131 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.025) 0:05:29.877 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:140 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.024) 0:05:29.901 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:8 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.015) 0:05:29.917 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:14 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.017) 0:05:29.934 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:21 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.025) 0:05:29.960 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:28 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.018) 0:05:29.979 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:35 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.021) 0:05:30.001 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:45 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.017) 0:05:30.018 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:54 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.017) 0:05:30.036 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:63 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.019) 0:05:30.055 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:72 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.023) 0:05:30.079 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:81 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.020) 0:05:30.100 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:3 Sunday 03 December 2023 04:36:03 +0000 (0:00:00.020) 0:05:30.121 ******* ok: [sut] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:11 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.217) 0:05:30.338 ******* ok: [sut] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:20 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.214) 0:05:30.552 ******* ok: [sut] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:28 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.024) 0:05:30.577 ******* ok: [sut] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:32 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.017) 0:05:30.595 ******* ok: [sut] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:46 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.208) 0:05:30.804 ******* skipping: [sut] => {} TASK [Show test blockinfo] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:50 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.021) 0:05:30.825 ******* skipping: [sut] => {} TASK [Show test pool size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:54 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.021) 0:05:30.847 ******* skipping: [sut] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:58 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.020) 0:05:30.868 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:68 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.019) 0:05:30.887 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:72 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.017) 0:05:30.904 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:77 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.015) 0:05:30.920 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:83 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.018) 0:05:30.938 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:88 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.015) 0:05:30.954 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:96 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.016) 0:05:30.971 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:104 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.016) 0:05:30.988 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:109 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.016) 0:05:31.004 ******* skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:113 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.015) 0:05:31.020 ******* skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:117 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.018) 0:05:31.038 ******* skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:121 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.016) 0:05:31.055 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:129 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.015) 0:05:31.071 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:138 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.015) 0:05:31.086 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:142 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.015) 0:05:31.102 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:150 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.016) 0:05:31.119 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:156 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.018) 0:05:31.137 ******* ok: [sut] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:160 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.019) 0:05:31.157 ******* ok: [sut] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:164 Sunday 03 December 2023 04:36:04 +0000 (0:00:00.018) 0:05:31.175 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:5 Sunday 03 December 2023 
04:36:05 +0000 (0:00:00.023) 0:05:31.199 ******* ok: [sut] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.034670", "end": "2023-12-03 04:36:05.221249", "rc": 0, "start": "2023-12-03 04:36:05.186579" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:13 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.266) 0:05:31.466 ******* ok: [sut] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:18 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.021) 0:05:31.488 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:27 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.024) 0:05:31.512 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:35 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.017) 0:05:31.530 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:41 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.017) 0:05:31.548 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:47 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.019) 0:05:31.568 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:27 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.017) 0:05:31.586 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:44 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.015) 0:05:31.602 ******* TASK [Clean up variable namespace] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:54 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.015) 0:05:31.617 ******* ok: [sut] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/create-test-file.yml:12 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.015) 0:05:31.633 ******* changed: [sut] => { 
"changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:406 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.225) 0:05:31.858 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml for sut TASK [Store global variable value copy] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:4 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.030) 0:05:31.889 ******* ok: [sut] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:10 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.020) 0:05:31.909 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.023) 0:05:31.933 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.024) 0:05:31.957 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.018) 0:05:31.975 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.041) 0:05:32.017 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.016) 0:05:32.034 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.015) 0:05:32.049 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.015) 0:05:32.064 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.017) 0:05:32.081 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:36:05 +0000 (0:00:00.032) 0:05:32.114 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:36:08 +0000 (0:00:02.613) 0:05:34.728 ******* ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:36:08 +0000 (0:00:00.023) 0:05:34.751 ******* ok: [sut] => { "storage_volumes": [] } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:36:08 +0000 (0:00:00.021) 0:05:34.773 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:36:10 +0000 (0:00:02.082) 0:05:36.855 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 
04:36:10 +0000 (0:00:00.031) 0:05:36.887 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:36:10 +0000 (0:00:00.025) 0:05:36.912 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:36:10 +0000 (0:00:00.015) 0:05:36.928 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:36:10 +0000 (0:00:00.023) 0:05:36.951 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:36:13 +0000 (0:00:02.603) 0:05:39.554 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": 
"stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, 
"sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service": { "name": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:36:15 +0000 (0:00:02.107) 0:05:41.662 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:36:15 +0000 (0:00:00.031) 0:05:41.693 ******* changed: [sut] => (item=systemd-cryptsetup@luks\x2d7569a48d\x2d28d0\x2d4f69\x2d8223\x2d1a98e72edb37.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "name": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket cryptsetup-pre.target \"system-systemd\\\\x2dcryptsetup.slice\" systemd-udevd-kernel.socket \"dev-mapper-foo\\\\x2dtest1.device\"", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": 
"\"blockdev@dev-mapper-luks\\\\x2d7569a48d\\\\x2d28d0\\\\x2d4f69\\\\x2d8223\\\\x2d1a98e72edb37.target\" umount.target cryptsetup.target", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-7569a48d-28d0-4f69-8223-1a98e72edb37 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-7569a48d-28d0-4f69-8223-1a98e72edb37 /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-7569a48d-28d0-4f69-8223-1a98e72edb37 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-7569a48d-28d0-4f69-8223-1a98e72edb37 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": 
"18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "14732", "LimitNPROCSoft": "14732", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14732", "LimitSIGPENDINGSoft": "14732", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d7569a48d\\\\x2d28d0\\\\x2d4f69\\\\x2d8223\\\\x2d1a98e72edb37.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "\"dev-mapper-luks\\\\x2d7569a48d\\\\x2d28d0\\\\x2d4f69\\\\x2d8223\\\\x2d1a98e72edb37.device\" cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", 
"SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2023-12-03 04:35:53 UTC", "StateChangeTimestampMonotonic": "1069979085", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4419", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d7569a48d\\\\x2d28d0\\\\x2d4f69\\\\x2d8223\\\\x2d1a98e72edb37.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:36:16 +0000 (0:00:00.840) 0:05:42.534 ******* fatal: [sut]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-7569a48d-28d0-4f69-8223-1a98e72edb37' in safe mode due to encryption removal TASK [linux-system-roles.storage : Failed message] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:106 Sunday 03 December 2023 04:36:18 +0000 (0:00:02.134) 0:05:44.669 ******* fatal: [sut]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-7569a48d-28d0-4f69-8223-1a98e72edb37' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'yabbadabbadoo', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:36:18 +0000 (0:00:00.025) 0:05:44.694 ******* changed: [sut] => (item=systemd-cryptsetup@luks\x2d7569a48d\x2d28d0\x2d4f69\x2d8223\x2d1a98e72edb37.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "name": 
"systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "486428672", "LimitMEMLOCKSoft": "486428672", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": 
"0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "14732", "LimitNPROCSoft": "14732", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14732", "LimitSIGPENDINGSoft": "14732", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d7569a48d\\\\x2d28d0\\\\x2d4f69\\\\x2d8223\\\\x2d1a98e72edb37.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "\"dev-mapper-luks\\\\x2d7569a48d\\\\x2d28d0\\\\x2d4f69\\\\x2d8223\\\\x2d1a98e72edb37.device\" cryptsetup.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2023-12-03 04:35:53 UTC", "StateChangeTimestampMonotonic": "1069979085", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4419", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", 
"TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:29 Sunday 03 December 2023 04:36:19 +0000 (0:00:00.830) 0:05:45.525 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:34 Sunday 03 December 2023 04:36:19 +0000 (0:00:00.018) 0:05:45.543 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:45 Sunday 03 December 2023 04:36:19 +0000 (0:00:00.023) 0:05:45.567 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-data-preservation.yml:11 Sunday 03 December 2023 04:36:19 +0000 (0:00:00.016) 0:05:45.583 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578165.6302605, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1701578165.6302605, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1701578165.6302605, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2393360823", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-data-preservation.yml:16 Sunday 03 December 2023 04:36:19 +0000 (0:00:00.217) 0:05:45.801 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:429 Sunday 03 December 2023 04:36:19 +0000 (0:00:00.019) 0:05:45.820 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 04:36:19 +0000 (0:00:00.043) 0:05:45.863 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:36:19 +0000 (0:00:00.021) 0:05:45.885 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : 
Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:36:19 +0000 (0:00:00.019) 0:05:45.905 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:36:19 +0000 (0:00:00.040) 0:05:45.946 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:36:19 +0000 (0:00:00.017) 0:05:45.963 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:36:19 +0000 (0:00:00.022) 0:05:45.985 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:36:19 +0000 (0:00:00.018) 0:05:46.004 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:36:19 +0000 (0:00:00.016) 0:05:46.020 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:36:19 +0000 (0:00:00.046) 0:05:46.067 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:36:22 +0000 (0:00:02.616) 0:05:48.684 ******* ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:36:22 +0000 (0:00:00.022) 0:05:48.707 ******* ok: [sut] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:36:22 +0000 (0:00:00.020) 0:05:48.728 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:36:24 +0000 (0:00:02.085) 0:05:50.813 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:36:24 +0000 (0:00:00.069) 0:05:50.883 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:36:24 +0000 (0:00:00.031) 0:05:50.914 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:36:24 +0000 (0:00:00.019) 0:05:50.934 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:36:24 +0000 (0:00:00.027) 0:05:50.962 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:36:27 +0000 
(0:00:02.597) 0:05:53.559 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": 
"dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { 
"name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service": { "name": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": 
"systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": 
"stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:36:29 +0000 (0:00:02.051) 0:05:55.610 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:36:29 +0000 (0:00:00.028) 0:05:55.639 ******* changed: [sut] => (item=systemd-cryptsetup@luks\x2d7569a48d\x2d28d0\x2d4f69\x2d8223\x2d1a98e72edb37.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "name": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-udevd-kernel.socket systemd-journald.socket cryptsetup-pre.target \"system-systemd\\\\x2dcryptsetup.slice\" \"dev-mapper-foo\\\\x2dtest1.device\"", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target \"blockdev@dev-mapper-luks\\\\x2d7569a48d\\\\x2d28d0\\\\x2d4f69\\\\x2d8223\\\\x2d1a98e72edb37.target\" cryptsetup.target", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", 
"DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-7569a48d-28d0-4f69-8223-1a98e72edb37 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-7569a48d-28d0-4f69-8223-1a98e72edb37 /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-7569a48d-28d0-4f69-8223-1a98e72edb37 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-7569a48d-28d0-4f69-8223-1a98e72edb37 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "14732", "LimitNPROCSoft": "14732", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14732", "LimitSIGPENDINGSoft": "14732", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", 
"MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d7569a48d\\\\x2d28d0\\\\x2d4f69\\\\x2d8223\\\\x2d1a98e72edb37.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2d7569a48d\\\\x2d28d0\\\\x2d4f69\\\\x2d8223\\\\x2d1a98e72edb37.device\"", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2023-12-03 04:35:53 UTC", "StateChangeTimestampMonotonic": "1069979085", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4419", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d7569a48d\\\\x2d28d0\\\\x2d4f69\\\\x2d8223\\\\x2d1a98e72edb37.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:36:30 +0000 (0:00:00.828) 0:05:56.467 ******* changed: [sut] => { "actions": [ { "action": "destroy format", "device": 
"/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:98 Sunday 03 December 2023 04:36:33 +0000 (0:00:03.079) 0:05:59.547 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:36:33 +0000 (0:00:00.018) 0:05:59.566 ******* changed: [sut] => (item=systemd-cryptsetup@luks\x2d7569a48d\x2d28d0\x2d4f69\x2d8223\x2d1a98e72edb37.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "name": 
"systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "486428672", "LimitMEMLOCKSoft": "486428672", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": 
"0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "14732", "LimitNPROCSoft": "14732", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14732", "LimitSIGPENDINGSoft": "14732", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d7569a48d\\\\x2d28d0\\\\x2d4f69\\\\x2d8223\\\\x2d1a98e72edb37.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2023-12-03 04:35:53 UTC", "StateChangeTimestampMonotonic": "1069979085", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4419", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", 
"Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:116 Sunday 03 December 2023 04:36:34 +0000 (0:00:00.831) 0:06:00.398 ******* ok: [sut] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:121 Sunday 03 December 2023 04:36:34 +0000 (0:00:00.021) 0:06:00.419 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:125 Sunday 03 December 2023 04:36:34 +0000 (0:00:00.019) 0:06:00.438 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:141 Sunday 03 December 2023 04:36:34 +0000 (0:00:00.019) 0:06:00.457 ******* changed: [sut] => (item={'src': '/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-7569a48d-28d0-4f69-8223-1a98e72edb37" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:153 Sunday 03 December 2023 04:36:34 +0000 (0:00:00.227) 0:06:00.685 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:158 Sunday 03 December 2023 04:36:35 +0000 (0:00:00.819) 0:06:01.505 ******* changed: [sut] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": 
null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:170 Sunday 03 December 2023 04:36:35 +0000 (0:00:00.401) 0:06:01.907 ******* skipping: [sut] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:185 Sunday 03 December 2023 04:36:35 +0000 (0:00:00.023) 0:06:01.930 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:193 Sunday 03 December 2023 04:36:36 +0000 (0:00:01.080) 0:06:03.011 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578131.2704709, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "5c4a230e44cca0d5a933df0616a7c4b2f362f4de", "ctime": 1701578131.253471, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263373, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1701578131.252471, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "3435316783", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Sunday 03 December 2023 04:36:37 +0000 (0:00:00.282) 0:06:03.293 ******* changed: [sut] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-7569a48d-28d0-4f69-8223-1a98e72edb37', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [linux-system-roles.storage : Update facts] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:220 Sunday 03 December 2023 04:36:37 +0000 (0:00:00.375) 0:06:03.669 ******* ok: [sut] TASK [Verify role results] ***************************************************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:445 Sunday 03 December 2023 04:36:38 +0000 (0:00:01.368) 0:06:05.037 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml for sut TASK [Print out pool information] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:2 Sunday 03 December 2023 04:36:38 +0000 (0:00:00.035) 0:06:05.072 ******* ok: [sut] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:7 Sunday 03 December 2023 04:36:38 +0000 (0:00:00.077) 0:06:05.150 ******* skipping: [sut] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:15 Sunday 03 December 2023 04:36:38 +0000 (0:00:00.016) 0:06:05.166 ******* ok: [sut] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "4911efeb-ef81-4686-bca9-b47723d0f4ea" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "zHVbMb-LhEM-NyDy-agwI-gQkg-OzfF-MA4dTW" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "e7a9b339-c47b-44f9-a3db-5567a302dcca" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:20 Sunday 03 December 2023 04:36:39 +0000 (0:00:00.323) 0:06:05.490 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003206", "end": "2023-12-03 04:36:39.512533", "rc": 0, "start": "2023-12-03 04:36:39.509327" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Oct 10 09:41:04 2023 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=e7a9b339-c47b-44f9-a3db-5567a302dcca / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:25 Sunday 03 December 2023 04:36:39 +0000 (0:00:00.248) 0:06:05.738 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003224", "end": "2023-12-03 04:36:39.722456", "failed_when_result": false, "rc": 0, "start": "2023-12-03 04:36:39.719232" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:34 Sunday 03 December 2023 04:36:39 +0000 (0:00:00.209) 0:06:05.948 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml for sut TASK [Set _storage_pool_tests] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml:5 Sunday 03 December 2023 04:36:39 +0000 (0:00:00.031) 0:06:05.980 ******* ok: [sut] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Verify pool subset] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml:18 Sunday 03 December 2023 04:36:39 +0000 (0:00:00.018) 0:06:05.998 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-volumes.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:2 Sunday 03 December 2023 04:36:39 +0000 (0:00:00.037) 0:06:06.036 ******* ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:13 Sunday 03 December 2023 04:36:39 +0000 (0:00:00.026) 0:06:06.063 ******* ok: [sut] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:22 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.212) 0:06:06.275 ******* ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] 
************************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:27 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.018) 0:06:06.294 ******* ok: [sut] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:33 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.021) 0:06:06.315 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:42 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.021) 0:06:06.337 ******* ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:48 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.019) 0:06:06.357 ******* ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:54 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.019) 0:06:06.376 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:59 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.015) 0:06:06.392 ******* ok: [sut] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:73 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.024) 0:06:06.416 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml for sut TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:8 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.031) 0:06:06.448 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:14 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.016) 0:06:06.464 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:21 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.014) 0:06:06.479 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:28 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.015) 0:06:06.494 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] 
************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:35 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.017) 0:06:06.512 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:45 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.016) 0:06:06.528 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:54 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.015) 0:06:06.544 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:64 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.015) 0:06:06.559 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:74 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.016) 0:06:06.575 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:85 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.015) 0:06:06.591 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:95 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.017) 0:06:06.608 ******* ok: [sut] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:76 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.016) 0:06:06.625 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-lvmraid.yml for sut TASK [Validate pool member LVM RAID settings] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-lvmraid.yml:2 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.034) 0:06:06.659 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml for sut TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:8 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.037) 0:06:06.697 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:16 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.017) 0:06:06.714 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:21 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.017) 0:06:06.731 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:29 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.016) 0:06:06.748 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:34 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.016) 0:06:06.765 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:40 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.018) 0:06:06.783 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:46 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.017) 0:06:06.801 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:79 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.016) 0:06:06.818 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-thin.yml for sut TASK [Validate pool member thinpool settings] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-thin.yml:2 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.032) 0:06:06.851 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml for sut TASK [Get information about thinpool] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml:8 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.033) 0:06:06.884 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml:16 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.016) 0:06:06.900 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml:23 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.015) 0:06:06.916 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml:27 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.014) 0:06:06.930 ******* ok: [sut] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } 
TASK [Check member encryption] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:82 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.015) 0:06:06.945 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:5 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.070) 0:06:07.016 ******* ok: [sut] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:13 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.019) 0:06:07.036 ******* skipping: [sut] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:20 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.019) 0:06:07.055 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml for sut TASK [Set variables used by tests] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:2 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.031) 0:06:07.087 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:9 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.018) 0:06:07.105 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:18 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.019) 0:06:07.125 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:27 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.016) 0:06:07.141 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:37 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.016) 0:06:07.157 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:47 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.016) 0:06:07.174 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:27 Sunday 03 December 2023 04:36:40 +0000 (0:00:00.017) 0:06:07.191 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:85 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.015) 0:06:07.206 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-vdo.yml for sut TASK [Validate pool member VDO settings] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-vdo.yml:2 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.035) 0:06:07.242 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml for sut TASK [Get information about VDO deduplication] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:9 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.033) 0:06:07.275 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:16 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.015) 0:06:07.291 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:22 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.015) 0:06:07.306 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:28 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.016) 0:06:07.322 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:35 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.016) 0:06:07.339 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:41 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.017) 0:06:07.356 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:47 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.014) 0:06:07.371 ******* ok: [sut] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:88 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.014) 0:06:07.386 ******* ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], 
"_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-volumes.yml:3 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.014) 0:06:07.401 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml for sut TASK [Set storage volume test variables] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:2 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.030) 0:06:07.432 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:21 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.021) 0:06:07.454 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml for sut TASK [Get expected mount device based on device type] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:7 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.082) 0:06:07.536 ******* ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:16 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.019) 0:06:07.555 ******* ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1016730, "block_size": 4096, "block_total": 1032192, "block_used": 15462, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4164526080, "size_total": 4227858432, "uuid": "4911efeb-ef81-4686-bca9-b47723d0f4ea" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1016730, "block_size": 4096, "block_total": 1032192, "block_used": 15462, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4164526080, "size_total": 4227858432, "uuid": "4911efeb-ef81-4686-bca9-b47723d0f4ea" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:38 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.027) 
0:06:07.582 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:51 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.016) 0:06:07.599 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:63 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.020) 0:06:07.619 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:71 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.019) 0:06:07.639 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:83 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.016) 0:06:07.655 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:95 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.014) 0:06:07.670 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the mount fs type] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:110 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.017) 0:06:07.687 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get path of test volume device] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:122 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.019) 0:06:07.707 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:128 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.014) 0:06:07.721 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:134 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.014) 0:06:07.736 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:146 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.016) 0:06:07.753 ******* ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] 
*********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:2 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.014) 0:06:07.768 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:40 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.035) 0:06:07.803 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:48 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.020) 0:06:07.824 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:58 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.024) 0:06:07.848 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:71 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.016) 0:06:07.865 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:3 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.016) 0:06:07.881 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:12 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.022) 0:06:07.904 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:3 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.028) 0:06:07.933 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578197.409066, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1701578193.2400916, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1674, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1701578193.2400916, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": 
false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:9 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.217) 0:06:08.151 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:16 Sunday 03 December 2023 04:36:41 +0000 (0:00:00.023) 0:06:08.174 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:24 Sunday 03 December 2023 04:36:42 +0000 (0:00:00.053) 0:06:08.227 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:30 Sunday 03 December 2023 04:36:42 +0000 (0:00:00.021) 0:06:08.248 ******* ok: [sut] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:34 Sunday 03 December 2023 04:36:42 +0000 (0:00:00.017) 0:06:08.266 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:39 Sunday 03 December 2023 04:36:42 +0000 (0:00:00.016) 0:06:08.282 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:3 Sunday 03 December 2023 04:36:42 +0000 (0:00:00.018) 0:06:08.301 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:10 Sunday 03 December 2023 04:36:42 +0000 (0:00:00.016) 0:06:08.317 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:17 Sunday 03 December 2023 04:36:44 +0000 (0:00:02.619) 0:06:10.936 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:23 Sunday 03 December 2023 04:36:44 +0000 (0:00:00.020) 0:06:10.957 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:32 Sunday 03 December 2023 04:36:44 +0000 (0:00:00.018) 
0:06:10.976 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:45 Sunday 03 December 2023 04:36:44 +0000 (0:00:00.026) 0:06:11.003 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:51 Sunday 03 December 2023 04:36:44 +0000 (0:00:00.018) 0:06:11.021 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:56 Sunday 03 December 2023 04:36:44 +0000 (0:00:00.019) 0:06:11.041 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:69 Sunday 03 December 2023 04:36:44 +0000 (0:00:00.019) 0:06:11.061 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:81 Sunday 03 December 2023 04:36:44 +0000 (0:00:00.017) 0:06:11.078 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:94 Sunday 03 December 2023 04:36:44 +0000 (0:00:00.018) 0:06:11.096 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:106 Sunday 03 December 2023 04:36:44 +0000 (0:00:00.026) 0:06:11.123 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:114 Sunday 03 December 2023 04:36:44 +0000 (0:00:00.025) 0:06:11.149 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:122 Sunday 03 December 2023 04:36:44 +0000 (0:00:00.017) 0:06:11.166 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:131 Sunday 03 December 2023 04:36:44 +0000 (0:00:00.019) 0:06:11.186 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:140 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.017) 0:06:11.204 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:8 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.015) 0:06:11.219 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:14 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.016) 0:06:11.235 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:21 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.016) 0:06:11.251 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:28 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.017) 0:06:11.269 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:35 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.020) 0:06:11.289 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:45 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.017) 0:06:11.307 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:54 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.017) 0:06:11.324 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:63 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.017) 0:06:11.342 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:72 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.018) 0:06:11.360 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:81 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.018) 0:06:11.379 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of 
the volume] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:3 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.018) 0:06:11.398 ******* ok: [sut] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:11 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.210) 0:06:11.609 ******* ok: [sut] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:20 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.216) 0:06:11.826 ******* ok: [sut] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:28 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.022) 0:06:11.848 ******* ok: [sut] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:32 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.018) 0:06:11.867 ******* ok: [sut] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:46 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.212) 0:06:12.079 ******* skipping: [sut] => {} TASK [Show test blockinfo] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:50 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.019) 0:06:12.099 ******* skipping: [sut] => {} TASK [Show test pool size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:54 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.018) 0:06:12.118 ******* skipping: [sut] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:58 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.018) 0:06:12.136 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:68 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.019) 0:06:12.156 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:72 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.015) 0:06:12.171 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:77 Sunday 03 December 2023 04:36:45 +0000 (0:00:00.015) 0:06:12.187 
******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:83 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.018) 0:06:12.205 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:88 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.016) 0:06:12.222 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:96 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.017) 0:06:12.239 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:104 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.019) 0:06:12.259 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:109 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.018) 0:06:12.278 ******* skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:113 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.016) 0:06:12.294 ******* skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:117 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.015) 0:06:12.309 ******* skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:121 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.015) 0:06:12.325 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:129 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.014) 0:06:12.340 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:138 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.017) 0:06:12.357 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:142 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.016) 0:06:12.374 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:150 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.016) 0:06:12.390 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:156 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.015) 0:06:12.405 ******* ok: [sut] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:160 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.018) 0:06:12.424 ******* ok: [sut] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:164 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.016) 0:06:12.441 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:5 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.025) 0:06:12.466 ******* ok: [sut] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.034075", "end": "2023-12-03 04:36:46.487284", "rc": 0, "start": "2023-12-03 04:36:46.453209" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:13 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.247) 0:06:12.714 ******* ok: [sut] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:18 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.022) 0:06:12.736 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:27 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.021) 0:06:12.757 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:35 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.016) 0:06:12.774 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:41 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.016) 0:06:12.791 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:47 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.016) 0:06:12.808 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:27 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.017) 0:06:12.825 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:44 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.017) 0:06:12.842 ******* TASK [Clean up variable namespace] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:54 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.047) 0:06:12.890 ******* ok: [sut] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/create-test-file.yml:12 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.015) 0:06:12.906 ******* changed: [sut] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:451 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.220) 0:06:13.127 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml for sut TASK [Store global variable value copy] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:4 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.031) 0:06:13.159 ******* ok: [sut] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:10 Sunday 03 December 2023 04:36:46 +0000 (0:00:00.019) 0:06:13.178 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 04:36:47 +0000 (0:00:00.023) 0:06:13.202 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:36:47 +0000 (0:00:00.022) 0:06:13.225 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:36:47 +0000 (0:00:00.020) 0:06:13.246 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": 
"RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:36:47 +0000 (0:00:00.046) 0:06:13.293 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:36:47 +0000 (0:00:00.016) 0:06:13.309 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:36:47 +0000 (0:00:00.015) 0:06:13.324 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:36:47 +0000 (0:00:00.014) 0:06:13.339 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:36:47 +0000 (0:00:00.015) 0:06:13.355 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:36:47 +0000 (0:00:00.034) 0:06:13.389 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:36:49 +0000 (0:00:02.585) 0:06:15.974 ******* ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": 
"yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:36:49 +0000 (0:00:00.021) 0:06:15.996 ******* ok: [sut] => { "storage_volumes": [] } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:36:49 +0000 (0:00:00.026) 0:06:16.022 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:36:51 +0000 (0:00:01.964) 0:06:17.987 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:36:51 +0000 (0:00:00.034) 0:06:18.022 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:36:51 +0000 (0:00:00.029) 0:06:18.052 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:36:51 +0000 (0:00:00.020) 0:06:18.072 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:36:51 +0000 (0:00:00.029) 0:06:18.101 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2 TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:36:54 +0000 (0:00:02.627) 0:06:20.729 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", 
"state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, 
"import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { 
"name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": 
"systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service": { "name": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:36:57 +0000 (0:00:03.071) 0:06:23.800 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:36:57 +0000 (0:00:00.032) 0:06:23.833 ******* changed: [sut] => (item=systemd-cryptsetup@luks\x2d7569a48d\x2d28d0\x2d4f69\x2d8223\x2d1a98e72edb37.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "name": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "\"system-systemd\\\\x2dcryptsetup.slice\" systemd-journald.socket \"dev-mapper-foo\\\\x2dtest1.device\" systemd-udevd-kernel.socket cryptsetup-pre.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target \"blockdev@dev-mapper-luks\\\\x2d7569a48d\\\\x2d28d0\\\\x2d4f69\\\\x2d8223\\\\x2d1a98e72edb37.target\"", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-7569a48d-28d0-4f69-8223-1a98e72edb37", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-7569a48d-28d0-4f69-8223-1a98e72edb37 /dev/mapper/foo-test1 - ; ignore_errors=no ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-7569a48d-28d0-4f69-8223-1a98e72edb37 /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-7569a48d-28d0-4f69-8223-1a98e72edb37 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-7569a48d-28d0-4f69-8223-1a98e72edb37 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "14732", "LimitNPROCSoft": "14732", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14732", "LimitSIGPENDINGSoft": "14732", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d7569a48d\\\\x2d28d0\\\\x2d4f69\\\\x2d8223\\\\x2d1a98e72edb37.service\"", 
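The unit being masked in this status dump, systemd-cryptsetup@luks\x2d7569a48d\x2d28d0\x2d4f69\x2d8223\x2d1a98e72edb37.service, is the generator-created service for the LUKS mapping; the \x2d sequences are systemd's escaping of the hyphens in the device name (the same escaping systemd-escape produces). Masking it while the role reworks the device presumably keeps systemd from attaching or detaching the mapping on its own mid-change. A minimal sketch of the same effect with the plain systemd module, assuming Ansible 2.9 as used in this run and not taken from the test itself, would be:

    - name: Mask a generated cryptsetup unit (illustrative sketch only)
      systemd:
        # Single-quoted so the literal \x2d escapes in the unit name are preserved.
        name: 'systemd-cryptsetup@luks\x2d7569a48d\x2d28d0\x2d4f69\x2d8223\x2d1a98e72edb37.service'
        masked: true

The role unmasks the same unit again in the "Unmask the systemd cryptsetup services" task further down, once the blivet run has finished.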
"NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2023-12-03 04:35:53 UTC", "StateChangeTimestampMonotonic": "1069979085", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4419", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d7569a48d\\\\x2d28d0\\\\x2d4f69\\\\x2d8223\\\\x2d1a98e72edb37.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:36:58 +0000 (0:00:00.828) 0:06:24.662 ******* fatal: [sut]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [linux-system-roles.storage : Failed message] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:106 Sunday 03 December 2023 04:37:00 +0000 (0:00:02.035) 0:06:26.697 ******* fatal: [sut]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'yabbadabbadoo', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:37:00 +0000 (0:00:00.021) 0:06:26.719 ******* changed: [sut] => (item=systemd-cryptsetup@luks\x2d7569a48d\x2d28d0\x2d4f69\x2d8223\x2d1a98e72edb37.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "name": 
"systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "486428672", "LimitMEMLOCKSoft": "486428672", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": 
"0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "14732", "LimitNPROCSoft": "14732", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14732", "LimitSIGPENDINGSoft": "14732", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d7569a48d\\x2d28d0\\x2d4f69\\x2d8223\\x2d1a98e72edb37.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d7569a48d\\\\x2d28d0\\\\x2d4f69\\\\x2d8223\\\\x2d1a98e72edb37.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4419", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": 
"masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:29 Sunday 03 December 2023 04:37:01 +0000 (0:00:00.827) 0:06:27.546 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:34 Sunday 03 December 2023 04:37:01 +0000 (0:00:00.020) 0:06:27.567 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-failed.yml:45 Sunday 03 December 2023 04:37:01 +0000 (0:00:00.026) 0:06:27.594 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-data-preservation.yml:11 Sunday 03 December 2023 04:37:01 +0000 (0:00:00.017) 0:06:27.612 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578206.900008, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1701578206.900008, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1701578206.900008, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3755737579", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-data-preservation.yml:16 Sunday 03 December 2023 04:37:01 +0000 (0:00:00.223) 0:06:27.835 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:474 Sunday 03 December 2023 04:37:01 +0000 (0:00:00.019) 0:06:27.854 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 04:37:01 +0000 (0:00:00.053) 0:06:27.908 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:37:01 +0000 (0:00:00.026) 0:06:27.934 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:37:01 +0000 (0:00:00.023) 
0:06:27.958 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:37:01 +0000 (0:00:00.044) 0:06:28.002 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:37:01 +0000 (0:00:00.016) 0:06:28.019 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:37:01 +0000 (0:00:00.016) 0:06:28.035 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:37:01 +0000 (0:00:00.016) 0:06:28.051 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:37:01 +0000 (0:00:00.015) 0:06:28.067 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:37:01 +0000 (0:00:00.033) 0:06:28.101 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:37:04 +0000 (0:00:02.599) 0:06:30.700 ******* ok: [sut] => { "storage_pools": [ { 
"disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:37:04 +0000 (0:00:00.020) 0:06:30.721 ******* ok: [sut] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:37:04 +0000 (0:00:00.015) 0:06:30.736 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:37:06 +0000 (0:00:01.951) 0:06:32.688 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:37:06 +0000 (0:00:00.030) 0:06:32.719 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:37:06 +0000 (0:00:00.025) 0:06:32.745 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:37:06 +0000 (0:00:00.015) 0:06:32.761 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:37:06 +0000 (0:00:00.023) 0:06:32.784 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2 TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:37:09 +0000 (0:00:02.588) 0:06:35.373 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", 
"state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", 
"state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": 
"stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": 
"running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": 
"systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:37:11 +0000 (0:00:02.096) 0:06:37.469 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:37:11 +0000 (0:00:00.029) 0:06:37.499 ******* TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:37:11 +0000 (0:00:00.015) 0:06:37.514 ******* changed: [sut] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "state": "mounted" } ], "packages": [ "e2fsprogs", "cryptsetup", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], 
"volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:98 Sunday 03 December 2023 04:37:22 +0000 (0:00:11.074) 0:06:48.588 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:37:22 +0000 (0:00:00.016) 0:06:48.605 ******* TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:116 Sunday 03 December 2023 04:37:22 +0000 (0:00:00.014) 0:06:48.620 ******* ok: [sut] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "state": "mounted" } ], "packages": [ "e2fsprogs", "cryptsetup", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", 
"state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:121 Sunday 03 December 2023 04:37:22 +0000 (0:00:00.020) 0:06:48.640 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:125 Sunday 03 December 2023 04:37:22 +0000 (0:00:00.019) 0:06:48.659 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:141 Sunday 03 December 2023 04:37:22 +0000 (0:00:00.017) 0:06:48.677 ******* changed: [sut] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:153 Sunday 03 December 2023 04:37:22 +0000 (0:00:00.232) 0:06:48.909 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:158 Sunday 03 December 2023 04:37:23 +0000 (0:00:00.799) 0:06:49.709 ******* changed: [sut] => (item={'src': '/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e" } TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:170 Sunday 03 December 2023 04:37:23 +0000 (0:00:00.249) 0:06:49.958 ******* skipping: [sut] => (item={'src': '/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:185 Sunday 03 December 2023 04:37:23 +0000 (0:00:00.023) 0:06:49.982 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:193 Sunday 03 December 2023 04:37:24 +0000 (0:00:00.791) 0:06:50.774 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578199.7210517, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1701578197.4110658, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263374, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1701578197.402066, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "2325014855", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Sunday 03 December 2023 04:37:24 +0000 (0:00:00.214) 0:06:50.988 ******* changed: [sut] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e', 
'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:220 Sunday 03 December 2023 04:37:25 +0000 (0:00:00.233) 0:06:51.222 ******* ok: [sut] TASK [Verify role results] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:490 Sunday 03 December 2023 04:37:25 +0000 (0:00:00.796) 0:06:52.018 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml for sut TASK [Print out pool information] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:2 Sunday 03 December 2023 04:37:25 +0000 (0:00:00.036) 0:06:52.055 ******* ok: [sut] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:7 Sunday 03 December 2023 04:37:25 +0000 (0:00:00.019) 0:06:52.075 ******* skipping: [sut] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:15 Sunday 03 December 2023 04:37:25 +0000 (0:00:00.016) 0:06:52.092 ******* ok: [sut] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "ff37eaf6-a11b-46cf-a7e9-278bc7ec186e" }, "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "size": "4G", "type": "crypt", "uuid": "fd9e137e-1d47-4e16-a5e8-3b4a73c28284" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "zHVbMb-LhEM-NyDy-agwI-gQkg-OzfF-MA4dTW" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "e7a9b339-c47b-44f9-a3db-5567a302dcca" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:20 Sunday 03 December 2023 04:37:26 +0000 (0:00:00.215) 0:06:52.307 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003294", "end": "2023-12-03 04:37:26.291280", "rc": 0, "start": "2023-12-03 04:37:26.287986" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Oct 10 09:41:04 2023 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=e7a9b339-c47b-44f9-a3db-5567a302dcca / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:25 Sunday 03 December 2023 04:37:26 +0000 (0:00:00.208) 0:06:52.515 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003290", "end": "2023-12-03 04:37:26.502176", "failed_when_result": false, "rc": 0, "start": "2023-12-03 04:37:26.498886" } STDOUT: luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:34 Sunday 03 December 2023 04:37:26 +0000 (0:00:00.212) 0:06:52.728 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml for sut TASK [Set _storage_pool_tests] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml:5 Sunday 03 December 2023 04:37:26 +0000 (0:00:00.034) 0:06:52.763 ******* ok: [sut] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Verify pool subset] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool.yml:18 Sunday 03 December 2023 04:37:26 +0000 (0:00:00.015) 0:06:52.779 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-volumes.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:2 Sunday 03 December 2023 04:37:26 +0000 (0:00:00.074) 0:06:52.853 ******* ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:13 Sunday 03 December 2023 04:37:26 +0000 (0:00:00.024) 0:06:52.877 ******* ok: [sut] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:22 Sunday 03 December 2023 04:37:26 +0000 (0:00:00.210) 0:06:53.088 ******* ok: [sut] => 
{ "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:27 Sunday 03 December 2023 04:37:26 +0000 (0:00:00.020) 0:06:53.108 ******* ok: [sut] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:33 Sunday 03 December 2023 04:37:26 +0000 (0:00:00.021) 0:06:53.129 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:42 Sunday 03 December 2023 04:37:26 +0000 (0:00:00.020) 0:06:53.150 ******* ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:48 Sunday 03 December 2023 04:37:26 +0000 (0:00:00.018) 0:06:53.169 ******* ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:54 Sunday 03 December 2023 04:37:26 +0000 (0:00:00.019) 0:06:53.188 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:59 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.016) 0:06:53.205 ******* ok: [sut] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:73 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.024) 0:06:53.229 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml for sut TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:8 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.029) 0:06:53.258 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:14 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.017) 0:06:53.276 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:21 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.017) 0:06:53.293 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:28 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.015) 0:06:53.309 ******* skipping: [sut] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:35 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.016) 0:06:53.325 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:45 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.015) 0:06:53.341 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:54 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.014) 0:06:53.356 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:64 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.017) 0:06:53.373 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:74 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.015) 0:06:53.389 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:85 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.015) 0:06:53.404 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-md.yml:95 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.015) 0:06:53.419 ******* ok: [sut] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:76 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.014) 0:06:53.434 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-lvmraid.yml for sut TASK [Validate pool member LVM RAID settings] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-lvmraid.yml:2 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.035) 0:06:53.469 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml for sut TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:8 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.032) 0:06:53.502 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:16 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.016) 0:06:53.518 ******* 
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:21 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.016) 0:06:53.535 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:29 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.016) 0:06:53.551 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:34 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.016) 0:06:53.567 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:40 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.016) 0:06:53.584 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-lvmraid.yml:46 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.014) 0:06:53.599 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:79 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.015) 0:06:53.614 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-thin.yml for sut TASK [Validate pool member thinpool settings] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-thin.yml:2 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.036) 0:06:53.650 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml for sut TASK [Get information about thinpool] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml:8 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.033) 0:06:53.684 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml:16 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.016) 0:06:53.701 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml:23 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.016) 0:06:53.717 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-thin.yml:27 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.015) 0:06:53.732 ******* ok: [sut] => { 
"ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:82 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.015) 0:06:53.748 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:5 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.037) 0:06:53.786 ******* ok: [sut] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:13 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.019) 0:06:53.805 ******* skipping: [sut] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:20 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.020) 0:06:53.825 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml for sut TASK [Set variables used by tests] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:2 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.029) 0:06:53.855 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:9 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.020) 0:06:53.875 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:18 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.020) 0:06:53.896 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:27 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.015) 0:06:53.911 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:37 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.015) 0:06:53.926 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-crypttab.yml:47 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.015) 0:06:53.941 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] 
**************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-encryption.yml:27 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.015) 0:06:53.957 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:85 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.017) 0:06:53.974 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-vdo.yml for sut TASK [Validate pool member VDO settings] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-members-vdo.yml:2 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.070) 0:06:54.045 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml for sut TASK [Get information about VDO deduplication] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:9 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.034) 0:06:54.079 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:16 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.017) 0:06:54.097 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:22 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.016) 0:06:54.114 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:28 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.016) 0:06:54.131 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:35 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.017) 0:06:54.148 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:41 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.016) 0:06:54.164 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-pool-member-vdo.yml:47 Sunday 03 December 2023 04:37:27 +0000 (0:00:00.017) 0:06:54.181 ******* ok: [sut] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-members.yml:88 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.017) 0:06:54.199 ******* ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, 
"_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-pool-volumes.yml:3 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.015) 0:06:54.215 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml for sut TASK [Set storage volume test variables] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:2 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.031) 0:06:54.246 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:21 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.020) 0:06:54.266 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml for sut TASK [Get expected mount device based on device type] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:7 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.082) 0:06:54.349 ******* ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:16 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.019) 0:06:54.368 ******* ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1012662, "block_size": 4096, "block_total": 1028096, "block_used": 15434, "device": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "fstype": "xfs", "inode_available": 2088957, "inode_total": 2088960, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4147863552, "size_total": 4211081216, "uuid": "fd9e137e-1d47-4e16-a5e8-3b4a73c28284" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1012662, "block_size": 4096, "block_total": 1028096, "block_used": 15434, "device": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "fstype": "xfs", "inode_available": 2088957, "inode_total": 2088960, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4147863552, "size_total": 4211081216, "uuid": "fd9e137e-1d47-4e16-a5e8-3b4a73c28284" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint 
directory] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:38 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.025) 0:06:54.394 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:51 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.016) 0:06:54.411 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:63 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.022) 0:06:54.433 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:71 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.020) 0:06:54.453 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:83 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.016) 0:06:54.470 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:95 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.015) 0:06:54.486 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the mount fs type] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:110 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.014) 0:06:54.501 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get path of test volume device] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:122 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.020) 0:06:54.522 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:128 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.018) 0:06:54.541 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:134 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.017) 0:06:54.558 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:146 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.020) 0:06:54.579 ******* ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, 
"storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:2 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.015) 0:06:54.595 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:40 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.035) 0:06:54.630 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:48 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.023) 0:06:54.654 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:58 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.022) 0:06:54.676 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:71 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.019) 0:06:54.696 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:3 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.019) 0:06:54.715 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:12 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.027) 0:06:54.742 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:3 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.026) 0:06:54.769 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578245.0007932, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1701578242.0328097, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1674, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": 
false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1701578242.0328097, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:9 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.216) 0:06:54.985 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:16 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.022) 0:06:55.007 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:24 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.016) 0:06:55.024 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:30 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.020) 0:06:55.045 ******* ok: [sut] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:34 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.020) 0:06:55.065 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:39 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.021) 0:06:55.086 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:3 Sunday 03 December 2023 04:37:28 +0000 (0:00:00.021) 0:06:55.108 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578244.9847932, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1701578242.2598083, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1760, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1701578242.2598083, "nlink": 1, "path": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:10 Sunday 03 December 2023 04:37:29 +0000 (0:00:00.220) 0:06:55.329 ******* ok: [sut] => { "changed": false, 
"rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:17 Sunday 03 December 2023 04:37:31 +0000 (0:00:02.631) 0:06:57.960 ******* ok: [sut] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.008243", "end": "2023-12-03 04:37:31.956768", "rc": 0, "start": "2023-12-03 04:37:31.948525" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: ff37eaf6-a11b-46cf-a7e9-278bc7ec186e Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 730507 Threads: 2 Salt: 40 a5 d1 ef f9 14 24 a6 3e 40 1e ef fc 3a d8 dd 35 36 56 23 f7 50 23 39 2a d7 60 86 04 21 cb 16 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 93891 Salt: 73 ce 35 41 1f f4 f2 36 77 79 8c 18 10 cc cd 9b 78 ed 0b 42 56 04 c1 6f 85 6c f2 ed bd bc 23 12 Digest: c2 12 f5 a7 95 81 82 52 84 4d 92 42 c9 5f 9f 16 db 52 49 58 09 b0 cc c0 ec 29 28 60 01 6a c8 88 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:23 Sunday 03 December 2023 04:37:31 +0000 (0:00:00.223) 0:06:58.184 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:32 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.021) 0:06:58.206 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:45 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.022) 0:06:58.228 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:51 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.019) 0:06:58.248 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:56 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.019) 0:06:58.268 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:69 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.017) 0:06:58.285 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:81 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.017) 0:06:58.303 ******* 
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:94 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.016) 0:06:58.319 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:106 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.022) 0:06:58.342 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:114 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.021) 0:06:58.364 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:122 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.020) 0:06:58.384 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:131 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.061) 0:06:58.446 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:140 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.023) 0:06:58.470 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:8 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.016) 0:06:58.486 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:14 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.016) 0:06:58.503 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:21 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.017) 0:06:58.520 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:28 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.016) 0:06:58.537 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] 
**************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:35 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.018) 0:06:58.555 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:45 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.015) 0:06:58.571 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:54 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.016) 0:06:58.587 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:63 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.016) 0:06:58.604 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:72 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.015) 0:06:58.619 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:81 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.016) 0:06:58.636 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:3 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.017) 0:06:58.653 ******* ok: [sut] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:11 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.214) 0:06:58.867 ******* ok: [sut] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:20 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.216) 0:06:59.084 ******* ok: [sut] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:28 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.024) 0:06:59.108 ******* ok: [sut] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:32 Sunday 03 December 2023 04:37:32 +0000 (0:00:00.017) 0:06:59.126 ******* ok: [sut] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] 
********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:46 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.211) 0:06:59.337 ******* skipping: [sut] => {} TASK [Show test blockinfo] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:50 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.021) 0:06:59.359 ******* skipping: [sut] => {} TASK [Show test pool size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:54 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.021) 0:06:59.380 ******* skipping: [sut] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:58 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.020) 0:06:59.401 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:68 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.019) 0:06:59.421 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:72 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.017) 0:06:59.438 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:77 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.017) 0:06:59.456 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:83 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.017) 0:06:59.473 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:88 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.016) 0:06:59.490 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:96 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.017) 0:06:59.508 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:104 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.017) 0:06:59.526 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:109 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.017) 0:06:59.543 ******* skipping: [sut] 
=> {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:113 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.017) 0:06:59.561 ******* skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:117 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.018) 0:06:59.579 ******* skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:121 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.016) 0:06:59.596 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:129 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.017) 0:06:59.613 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:138 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.015) 0:06:59.628 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:142 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.016) 0:06:59.644 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:150 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.015) 0:06:59.660 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:156 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.015) 0:06:59.675 ******* ok: [sut] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:160 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.017) 0:06:59.693 ******* ok: [sut] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:164 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.019) 0:06:59.713 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:5 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.023) 0:06:59.737 ******* ok: [sut] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.036740", 
"end": "2023-12-03 04:37:33.758199", "rc": 0, "start": "2023-12-03 04:37:33.721459" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:13 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.251) 0:06:59.988 ******* ok: [sut] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:18 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.024) 0:07:00.012 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:27 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.030) 0:07:00.042 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:35 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.021) 0:07:00.064 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:41 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.020) 0:07:00.085 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:47 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.018) 0:07:00.104 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:27 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.026) 0:07:00.130 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:44 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.017) 0:07:00.147 ******* TASK [Clean up variable namespace] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:54 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.017) 0:07:00.165 ******* ok: [sut] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:493 Sunday 03 December 2023 04:37:33 +0000 (0:00:00.019) 0:07:00.185 ******* TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 03 December 2023 04:37:34 +0000 (0:00:00.064) 0:07:00.249 ******* included: 
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 03 December 2023 04:37:34 +0000 (0:00:00.031) 0:07:00.281 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 03 December 2023 04:37:34 +0000 (0:00:00.095) 0:07:00.376 ******* skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_37.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_37.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 03 December 2023 04:37:34 +0000 (0:00:00.046) 0:07:00.422 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 03 December 2023 04:37:34 +0000 (0:00:00.016) 0:07:00.439 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 03 December 2023 04:37:34 +0000 (0:00:00.017) 0:07:00.456 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 03 December 2023 04:37:34 +0000 (0:00:00.015) 0:07:00.472 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 03 December 2023 04:37:34 +0000 (0:00:00.019) 0:07:00.491 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] 
************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 03 December 2023 04:37:34 +0000 (0:00:00.039) 0:07:00.531 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 03 December 2023 04:37:36 +0000 (0:00:02.617) 0:07:03.148 ******* ok: [sut] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 03 December 2023 04:37:36 +0000 (0:00:00.017) 0:07:03.166 ******* ok: [sut] => { "storage_volumes": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 03 December 2023 04:37:36 +0000 (0:00:00.018) 0:07:03.185 ******* ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 03 December 2023 04:37:39 +0000 (0:00:02.092) 0:07:05.277 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 03 December 2023 04:37:39 +0000 (0:00:00.031) 0:07:05.309 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 03 December 2023 04:37:39 +0000 (0:00:00.026) 0:07:05.335 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 03 December 2023 04:37:39 +0000 (0:00:00.016) 0:07:05.352 ******* skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 03 December 2023 04:37:39 +0000 (0:00:00.024) 
0:07:05.376 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 03 December 2023 04:37:41 +0000 (0:00:02.647) 0:07:08.024 ******* ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { 
"name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", 
"source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, 
"quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": 
"systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { 
"name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 03 December 2023 04:37:43 +0000 (0:00:02.041) 0:07:10.066 ******* ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 03 December 2023 04:37:43 +0000 (0:00:00.027) 0:07:10.093 ******* TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 03 December 2023 04:37:43 +0000 (0:00:00.016) 0:07:10.110 ******* changed: [sut] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=zHVbMb-LhEM-NyDy-agwI-gQkg-OzfF-MA4dTW", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:98 Sunday 03 December 
2023 04:37:46 +0000 (0:00:03.078) 0:07:13.188 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 03 December 2023 04:37:47 +0000 (0:00:00.017) 0:07:13.205 ******* TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:116 Sunday 03 December 2023 04:37:47 +0000 (0:00:00.014) 0:07:13.219 ******* ok: [sut] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=zHVbMb-LhEM-NyDy-agwI-gQkg-OzfF-MA4dTW", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:121 Sunday 03 December 2023 04:37:47 +0000 (0:00:00.020) 0:07:13.239 ******* ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:125 Sunday 03 December 2023 04:37:47 +0000 (0:00:00.017) 0:07:13.257 ******* ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": 
"UUID=zHVbMb-LhEM-NyDy-agwI-gQkg-OzfF-MA4dTW", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:141 Sunday 03 December 2023 04:37:47 +0000 (0:00:00.018) 0:07:13.275 ******* changed: [sut] => (item={'src': '/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:153 Sunday 03 December 2023 04:37:47 +0000 (0:00:00.225) 0:07:13.501 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:158 Sunday 03 December 2023 04:37:48 +0000 (0:00:00.809) 0:07:14.310 ******* TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:170 Sunday 03 December 2023 04:37:48 +0000 (0:00:00.016) 0:07:14.326 ******* TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:185 Sunday 03 December 2023 04:37:48 +0000 (0:00:00.016) 0:07:14.342 ******* ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:193 Sunday 03 December 2023 04:37:48 +0000 (0:00:00.796) 0:07:15.139 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578244.9987931, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "806eb73d5b39a09ae7c99439fe476a0c1e9938da", "ctime": 1701578244.9817934, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, 
"gr_name": "root", "inode": 263373, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1701578244.9807932, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "898090397", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Sunday 03 December 2023 04:37:49 +0000 (0:00:00.219) 0:07:15.358 ******* changed: [sut] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-ff37eaf6-a11b-46cf-a7e9-278bc7ec186e", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [linux-system-roles.storage : Update facts] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:220 Sunday 03 December 2023 04:37:49 +0000 (0:00:00.233) 0:07:15.591 ******* ok: [sut] TASK [Verify role results] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/tests_luks.yml:503 Sunday 03 December 2023 04:37:50 +0000 (0:00:00.803) 0:07:16.395 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml for sut TASK [Print out pool information] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:2 Sunday 03 December 2023 04:37:50 +0000 (0:00:00.037) 0:07:16.433 ******* skipping: [sut] => {} TASK [Print out volume information] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:7 Sunday 03 December 2023 04:37:50 +0000 (0:00:00.020) 0:07:16.454 ******* ok: [sut] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=zHVbMb-LhEM-NyDy-agwI-gQkg-OzfF-MA4dTW", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:15 Sunday 03 December 2023 04:37:50 +0000 (0:00:00.022) 0:07:16.476 ******* ok: [sut] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "e7a9b339-c47b-44f9-a3db-5567a302dcca" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:20 Sunday 03 December 2023 04:37:50 +0000 (0:00:00.216) 0:07:16.692 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003343", "end": "2023-12-03 04:37:50.676966", "rc": 0, "start": "2023-12-03 04:37:50.673623" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Oct 10 09:41:04 2023 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=e7a9b339-c47b-44f9-a3db-5567a302dcca / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:25 Sunday 03 December 2023 04:37:50 +0000 (0:00:00.211) 0:07:16.903 ******* ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003236", "end": "2023-12-03 04:37:50.888506", "failed_when_result": false, "rc": 0, "start": "2023-12-03 04:37:50.885270" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:34 Sunday 03 December 2023 04:37:50 +0000 (0:00:00.210) 0:07:17.114 ******* TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:44 Sunday 03 December 2023 04:37:50 +0000 (0:00:00.014) 0:07:17.129 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml for sut TASK [Set storage volume test variables] *************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:2 Sunday 03 December 2023 04:37:50 +0000 (0:00:00.031) 0:07:17.161 ******* ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:21 Sunday 03 December 2023 04:37:50 +0000 (0:00:00.022) 0:07:17.183 ******* included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml for sut included: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml for sut TASK [Get expected mount device based on device type] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:7 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.086) 0:07:17.269 ******* ok: [sut] => { "ansible_facts": { "storage_test_device_path": 
"/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:16 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.020) 0:07:17.290 ******* ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:38 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.024) 0:07:17.314 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:51 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.016) 0:07:17.331 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:63 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.016) 0:07:17.348 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:71 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.022) 0:07:17.371 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:83 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.015) 0:07:17.387 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:95 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.016) 0:07:17.404 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the mount fs type] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:110 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.016) 0:07:17.420 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:122 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.014) 0:07:17.435 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:128 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.017) 0:07:17.452 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:134 Sunday 03 December 2023 04:37:51 
+0000 (0:00:00.016) 0:07:17.469 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-mount.yml:146 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.016) 0:07:17.485 ******* ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:2 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.014) 0:07:17.500 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:40 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.034) 0:07:17.535 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:48 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.016) 0:07:17.551 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:58 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.021) 0:07:17.573 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fstab.yml:71 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.016) 0:07:17.590 ******* ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:3 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.015) 0:07:17.605 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-fs.yml:12 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.016) 0:07:17.621 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:3 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.015) 0:07:17.637 ******* ok: [sut] => { "changed": false, "stat": { "atime": 1701578266.8456721, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1701578266.8456721, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 560, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1701578266.8456721, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:9 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.215) 0:07:17.853 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:16 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.021) 0:07:17.875 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:24 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.016) 0:07:17.891 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:30 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.013) 0:07:17.905 ******* ok: [sut] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:34 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.019) 0:07:17.925 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-device.yml:39 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.016) 0:07:17.942 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:3 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.012) 0:07:17.955 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:10 Sunday 03 December 2023 04:37:51 +0000 (0:00:00.015) 0:07:17.970 ******* ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] 
*************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:17 Sunday 03 December 2023 04:37:54 +0000 (0:00:02.594) 0:07:20.564 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:23 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.016) 0:07:20.580 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:32 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.014) 0:07:20.595 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:45 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.013) 0:07:20.609 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:51 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.015) 0:07:20.625 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:56 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.014) 0:07:20.640 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:69 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.012) 0:07:20.652 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:81 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.012) 0:07:20.665 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:94 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.012) 0:07:20.677 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:106 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.023) 0:07:20.700 ******* ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:114 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.018) 0:07:20.719 ******* skipping: [sut] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:122 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.015) 0:07:20.734 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:131 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.015) 0:07:20.749 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:140 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.015) 0:07:20.764 ******* ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:8 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.015) 0:07:20.780 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:14 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.018) 0:07:20.798 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:21 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.016) 0:07:20.814 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:28 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.015) 0:07:20.829 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:35 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.014) 0:07:20.844 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:45 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.015) 0:07:20.859 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:54 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.015) 0:07:20.875 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:63 Sunday 03 
December 2023 04:37:54 +0000 (0:00:00.016) 0:07:20.892 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:72 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.016) 0:07:20.909 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-md.yml:81 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.016) 0:07:20.925 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:3 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.016) 0:07:20.942 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:11 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.015) 0:07:20.958 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:20 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.021) 0:07:20.979 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:28 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.019) 0:07:20.999 ******* ok: [sut] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:32 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.017) 0:07:21.016 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:46 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.029) 0:07:21.046 ******* skipping: [sut] => {} TASK [Show test blockinfo] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:50 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.019) 0:07:21.065 ******* skipping: [sut] => {} TASK [Show test pool size] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:54 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.016) 0:07:21.082 ******* skipping: [sut] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:58 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.016) 0:07:21.098 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: 
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:68 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.018) 0:07:21.117 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:72 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.015) 0:07:21.133 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:77 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.015) 0:07:21.148 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:83 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.017) 0:07:21.166 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:88 Sunday 03 December 2023 04:37:54 +0000 (0:00:00.015) 0:07:21.182 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:96 Sunday 03 December 2023 04:37:55 +0000 (0:00:00.016) 0:07:21.198 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:104 Sunday 03 December 2023 04:37:55 +0000 (0:00:00.018) 0:07:21.217 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:109 Sunday 03 December 2023 04:37:55 +0000 (0:00:00.016) 0:07:21.233 ******* skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:113 Sunday 03 December 2023 04:37:55 +0000 (0:00:00.016) 0:07:21.250 ******* skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:117 Sunday 03 December 2023 04:37:55 +0000 (0:00:00.017) 0:07:21.268 ******* skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:121 Sunday 03 December 2023 04:37:55 +0000 (0:00:00.021) 0:07:21.289 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:129 Sunday 03 December 2023 04:37:55 +0000 (0:00:00.018) 0:07:21.308 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:138 Sunday 03 December 2023 04:37:55 +0000 (0:00:00.022) 0:07:21.330 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:142 Sunday 03 December 2023 04:37:55 +0000 (0:00:00.016) 0:07:21.347 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:150 Sunday 03 December 2023 04:37:55 +0000 (0:00:00.015) 0:07:21.362 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:156 Sunday 03 December 2023 04:37:55 +0000 (0:00:00.014) 0:07:21.377 ******* ok: [sut] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:160 Sunday 03 December 2023 04:37:55 +0000 (0:00:00.017) 0:07:21.394 ******* ok: [sut] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-size.yml:164 Sunday 03 December 2023 04:37:55 +0000 (0:00:00.017) 0:07:21.412 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:5 Sunday 03 December 2023 04:37:55 +0000 (0:00:00.017) 0:07:21.430 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:13 Sunday 03 December 2023 04:37:55 +0000 (0:00:00.014) 0:07:21.445 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:18 Sunday 03 December 2023 04:37:55 +0000 (0:00:00.015) 0:07:21.460 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:27 Sunday 03 December 2023 04:37:55 +0000 (0:00:00.014) 0:07:21.475 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:35 Sunday 03 December 2023 04:37:55 +0000 (0:00:00.015) 0:07:21.490 ******* skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" 
}

TASK [Set expected cache size] *************************************************
task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:41
Sunday 03 December 2023 04:37:55 +0000 (0:00:00.016) 0:07:21.506 *******
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-cache.yml:47
Sunday 03 December 2023 04:37:55 +0000 (0:00:00.018) 0:07:21.525 *******
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume.yml:27
Sunday 03 December 2023 04:37:55 +0000 (0:00:00.017) 0:07:21.543 *******
ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /WORKDIR/git-weekly-civnd1vj4n/tests/verify-role-results.yml:54
Sunday 03 December 2023 04:37:55 +0000 (0:00:00.019) 0:07:21.562 *******
ok: [sut] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
sut : ok=1183 changed=62 unreachable=0 failed=9 skipped=1014 rescued=9 ignored=0

Sunday 03 December 2023 04:37:55 +0000 (0:00:00.010) 0:07:21.573 *******
===============================================================================
linux-system-roles.storage : Manage the pools and volumes to match the specified state -- 11.70s
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84
linux-system-roles.storage : Manage the pools and volumes to match the specified state -- 11.43s
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84
linux-system-roles.storage : Manage the pools and volumes to match the specified state -- 11.07s
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84
linux-system-roles.storage : Manage the pools and volumes to match the specified state -- 11.03s
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84
linux-system-roles.storage : Manage the pools and volumes to match the specified state -- 10.65s
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84
linux-system-roles.storage : Manage the pools and volumes to match the specified state --- 9.31s
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84
linux-system-roles.storage : Make sure blivet is available -------------- 6.96s
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
linux-system-roles.storage : Make sure required packages are installed --- 4.17s
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41
linux-system-roles.storage : Make sure blivet is available -------------- 3.29s
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
linux-system-roles.storage : Manage the pools and volumes to match the specified state --- 3.08s
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84
linux-system-roles.storage : Manage the pools and volumes to match the specified state --- 3.08s
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84
linux-system-roles.storage : Get service facts -------------------------- 3.07s
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57
linux-system-roles.storage : Make sure required packages are installed --- 2.98s
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41
linux-system-roles.storage : Get required packages ---------------------- 2.69s
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21
Ensure cryptsetup is present -------------------------------------------- 2.68s
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:10 -----
linux-system-roles.storage : Make sure required packages are installed --- 2.65s
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41
linux-system-roles.storage : Make sure required packages are installed --- 2.64s
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41
linux-system-roles.storage : Make sure blivet is available -------------- 2.64s
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
linux-system-roles.storage : Make sure blivet is available -------------- 2.64s
/WORKDIR/git-weekly-civnd1vj4n/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Ensure cryptsetup is present -------------------------------------------- 2.63s
/WORKDIR/git-weekly-civnd1vj4n/tests/test-verify-volume-encryption.yml:10 -----
---^---^---^---^---^---

# STDERR: ---v---v---v---v---v---
---^---^---^---^---^---