Conversation

@ben-grande (Contributor):

In line with the qubes module, disposable creation now happens in from_appvm(). This change makes it possible to measure how long it takes to get a fresh disposable object, letting the performance tests distinguish the from_appvm() phase from the execution/run*() phase.

For: QubesOS/qubes-issues#10230
For: QubesOS/qubes-issues#1512
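The phase split described above can be sketched as follows; `create_phase` and `run_phase` are hypothetical stand-ins for `DispVM.from_appvm(...)` and the subsequent `run_service()` call, not real qubesadmin API:

```python
import time

def create_phase():
    """Stand-in for DispVM.from_appvm(app, appvm) creating the disposable."""
    time.sleep(0.01)  # simulated qubesd round-trip

def run_phase():
    """Stand-in for dispvm.run_service(...) executing in the disposable."""
    time.sleep(0.01)  # simulated service execution

start = time.monotonic()
create_phase()
created = time.monotonic()
run_phase()
done = time.monotonic()

# Each phase now gets its own measurement instead of one combined number.
create_time = created - start
run_time = done - created
print(f"create: {create_time:.3f}s, run: {run_time:.3f}s")
```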


Review comment on:

    wrapper = DispVMWrapper(app, method_dest)
    method_dest = "dom0"
    dispvm = app.qubesd_call(method_dest, "admin.vm.CreateDisposable")
Member:
This is a bad idea. Normal qubes intentionally have no permission to call admin.vm.CreateDisposable (nor any other admin.* calls), but they may be granted calls to the @dispvm target (possibly choosing different disposable template depending on the requested service).

Requiring extended admin API access just to measure time a bit better is a no-go...

@ben-grande (Contributor, author) commented Dec 10, 2025:

Background:

commit 9bb59cdd20fa551563a04346559be2f09500708a
Author: Marek Marczykowski-Górecki <[email protected]>
Date:   Sun Aug 6 12:22:47 2017 +0200

    vm: add DispVMWrapper for calling a single service in new DispVM

    This is a wrapper to use `$dispvm` target of qrexec call, just like any
    other service call in qubesadmin module - using vm.run_service().
    When running in dom0, qrexec-client-vm is not available, so DispVM needs
    to be created "manually", using appropriate Admin API call
    (admin.vm.CreateDisposable).

    QubesOS/qubes-issues#2974

I did not understand why using $dispvm was necessary; I thought it only served a need at that time.

> This is a bad idea. Normal qubes intentionally have no permission to call admin.vm.CreateDisposable (nor any other admin.* calls), but they may be granted calls to the @dispvm target (possibly choosing different disposable template depending on the requested service).

I'm not sure I completely understand this. Are you saying that this is problematic when core-admin-client is running in a domU?

> Requiring extended admin API access just to measure time a bit better is a no-go...

What about using the old code, keeping the $dispvm* target, and adding a method that just creates the disposable if self._method_dest.startswith("$dispvm")?

class DispVMWrapper(QubesVM):
    """Wrapper class for new DispVM, supporting only service call

    Note that when running in dom0, one needs to manually kill the DispVM
    after the service call ends.
    """

    def create_disposable(self):
        """Create dispvm at service call or run service directly."""
        if self._method_dest.startswith("$dispvm"):
            if self._method_dest.startswith("$dispvm:"):
                method_dest = self._method_dest[len("$dispvm:") :]
            else:
                method_dest = "dom0"
            dispvm = self.app.qubesd_call(
                method_dest, "admin.vm.CreateDisposable"
            )
            dispvm = dispvm.decode("ascii")
            self._method_dest = dispvm
        return self

    def run_service(self, service, **kwargs):
        """Create disposable if absent and run service."""
        if (
            self.app.qubesd_connection_type == "socket"
            and self._method_dest.startswith("$dispvm")
        ):
            self.create_disposable()
            # Service call may wait for session start, give it more time
            # than default 5s
            kwargs["connect_timeout"] = self.qrexec_timeout
        return super().run_service(service, **kwargs)

    def cleanup(self):
        """Cleanup after disposable usage."""
        # in 'remote' case nothing is needed, as DispVM is cleaned up
        # automatically
        if (
            self.app.qubesd_connection_type == "socket"
            and not self._method_dest.startswith("$dispvm")
        ):
            try:
                self.kill()
            except qubesadmin.exc.QubesVMNotRunningError:
                pass

    def start(self):
        """Create disposable if absent and start it."""
        if self._method_dest.startswith("$dispvm"):
            self.create_disposable()
        super().start()


class DispVM(QubesVM):
    """Disposable VM"""

    @classmethod
    def from_appvm(cls, app, appvm):
        """Returns a wrapper for calling service in a new DispVM based on given
        AppVM. If *appvm* is none, use default DispVM template"""

        if appvm:
            method_dest = "$dispvm:" + str(appvm)
        else:
            method_dest = "$dispvm"

        wrapper = DispVMWrapper(app, method_dest)
        return wrapper

@ben-grande (Contributor, author):

The solution above allows:

dispvm = qubesadmin.vm.DispVM.from_appvm(app, appvm).create_disposable()
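A self-contained sketch of that flow, with qubesd mocked out. The `FakeWrapper` class below only models the target-rewriting part of `create_disposable()`; it is not the real `DispVMWrapper` and does not depend on qubesadmin:

```python
from unittest import mock

class FakeWrapper:
    """Reduced model of DispVMWrapper: only the create_disposable() logic."""

    def __init__(self, app, method_dest):
        self.app = app
        self._method_dest = method_dest

    def create_disposable(self):
        """Create the disposable and rewrite the method destination."""
        if self._method_dest.startswith("$dispvm"):
            if self._method_dest.startswith("$dispvm:"):
                method_dest = self._method_dest[len("$dispvm:"):]
            else:
                method_dest = "dom0"
            dispvm = self.app.qubesd_call(method_dest, "admin.vm.CreateDisposable")
            self._method_dest = dispvm.decode("ascii")
        return self

app = mock.Mock()
app.qubesd_call.return_value = b"disp1234"

# "$dispvm:work" asks for a disposable based on the "work" disposable template.
w = FakeWrapper(app, "$dispvm:work").create_disposable()
print(w._method_dest)  # → disp1234 (the wrapper now targets the concrete qube)
app.qubesd_call.assert_called_with("work", "admin.vm.CreateDisposable")
```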

Member:

> Are you saying that this is problematic when core-admin-client is running in a domU?

Yes.

> What about using the old code, keeping the $dispvm* target, and adding a method that just creates the disposable if self._method_dest.startswith("$dispvm")?

This approach is okay: it preserves the special dispvm target, but allows an explicitly created disposable if somebody really wants one.

BTW, while at it, it would be good to change deprecated $dispvm to @dispvm.
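Accepting both prefixes during the transition could look like the hypothetical helper below (a sketch only; `split_dispvm_target` is not an existing qubesadmin function):

```python
def split_dispvm_target(method_dest):
    """Return (is_dispvm, admin_call_dest) for a qrexec target string.

    Understands the current '@dispvm' prefix as well as the deprecated
    '$dispvm' one, with an optional ':template' suffix.
    """
    for prefix in ("@dispvm", "$dispvm"):
        if method_dest == prefix:
            return True, "dom0"          # use the default disposable template
        if method_dest.startswith(prefix + ":"):
            return True, method_dest[len(prefix) + 1:]
    return False, method_dest            # not a disposable target at all

print(split_dispvm_target("@dispvm:work"))  # → (True, 'work')
print(split_dispvm_target("$dispvm"))       # → (True, 'dom0')
print(split_dispvm_target("work"))          # → (False, 'work')
```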

Creating the disposable cannot happen in "from_appvm()" as normal qubes
don't have permission to call "admin.vm.CreateDisposable" or other
"admin.*" calls.

With this new method, retrieval of the qube object can be done directly:

  dispvm = qubesadmin.vm.DispVM.from_appvm(app, dvm).create_disposable()

For: QubesOS/qubes-issues#10230
For: QubesOS/qubes-issues#1512
It itches that it is different.
@ben-grande ben-grande marked this pull request as ready for review December 10, 2025 09:13
@ben-grande ben-grande changed the title Create disposable right on from_appvm() Add method to solely create disposable Dec 10, 2025
@ben-grande (Contributor, author):

PipelineRetryFailed

@codecov

codecov bot commented Dec 10, 2025

Codecov Report

❌ Patch coverage is 76.92308% with 3 lines in your changes missing coverage. Please review.
✅ Project coverage is 76.15%. Comparing base (a5ea121) to head (96e818a).

Files with missing lines    Patch %   Lines
qubesadmin/vm/__init__.py   76.92%    3 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #401      +/-   ##
==========================================
+ Coverage   76.10%   76.15%   +0.04%     
==========================================
  Files          53       53              
  Lines        9287     9285       -2     
==========================================
+ Hits         7068     7071       +3     
+ Misses       2219     2214       -5     


@ben-grande (Contributor, author):

Openqa me please.

@qubesos-bot

qubesos-bot commented Dec 12, 2025

OpenQA test summary

Complete test suite and dependencies: https://openqa.qubes-os.org/tests/overview?distri=qubesos&version=4.3&build=2025121204-4.3&flavor=pull-requests

Test run included the following:

New failures, excluding unstable

Compared to: https://openqa.qubes-os.org/tests/overview?distri=qubesos&version=4.3&build=2025111104-4.3&flavor=update

  • system_tests_qwt_win10_seamless@hw13

  • system_tests_audio@hw1

    • TC_20_AudioVM_Pulse_whonix-workstation-18: test_223_audio_play_hvm (error)
      qubes.exc.QubesVMError: Cannot connect to qrexec agent for 60 secon...

    • TC_20_AudioVM_Pulse_whonix-workstation-18: test_224_audio_rec_muted_hvm (error)
      qubes.exc.QubesVMError: Cannot connect to qrexec agent for 60 secon...

    • TC_20_AudioVM_Pulse_whonix-workstation-18: test_225_audio_rec_unmuted_hvm (error)
      qubes.exc.QubesVMError: Cannot connect to qrexec agent for 60 secon...

    • TC_20_AudioVM_Pulse_whonix-workstation-18: test_252_audio_playback_audiovm_switch_hvm (error)
      qubes.exc.QubesVMError: Cannot connect to qrexec agent for 60 secon...

  • system_tests_dispvm_perf@hw7

    • TC_00_DispVMPerf_fedora-42-xfce: test_900_reader (failure)
      AssertionError: '/usr/lib/qubes/tests/dispvm_perf_reader.py --templ...
  • system_tests_extra

    • TC_00_InputProxy_debian-13-xfce: test_050_mouse_late_attach (failure)
      AssertionError: unexpectedly None : Device 'test-inst-input: Test i...

    • TC_00_InputProxy_fedora-42-xfce: test_050_mouse_late_attach (failure)
      AssertionError: unexpectedly None : Device 'test-inst-input: Test i...

  • system_tests_network_updates

    • TC_10_QvmTemplate_whonix-gateway-18: test_010_template_install (error)
      subprocess.CalledProcessError: Command 'timeout=120; while ! tor-ci...
  • system_tests_guivm_vnc_gui_interactive

    • collect_logs: wait_serial (wait serial expected)
      # wait_serial expected: qr/IJVTo-\d+-/...

    • collect_logs: Failed (test died + timed out)
      # Test died: command 'curl --form [email protected] --form upn...

    • collect_logs: wait_serial (wait serial expected)
      # wait_serial expected: qr/9FXye-\d+-/...

  • system_tests_audio

    • TC_20_AudioVM_Pulse_whonix-workstation-18: test_223_audio_play_hvm (error)
      qubes.exc.QubesVMError: Cannot connect to qrexec agent for 120 seco...

    • TC_20_AudioVM_Pulse_whonix-workstation-18: test_224_audio_rec_muted_hvm (error)
      qubes.exc.QubesVMError: Cannot connect to qrexec agent for 120 seco...

    • TC_20_AudioVM_Pulse_whonix-workstation-18: test_225_audio_rec_unmuted_hvm (error)
      qubes.exc.QubesVMError: Cannot connect to qrexec agent for 120 seco...

    • TC_20_AudioVM_Pulse_whonix-workstation-18: test_252_audio_playback_audiovm_switch_hvm (error)
      qubes.exc.QubesVMError: Cannot connect to qrexec agent for 120 seco...

  • system_tests_basic_vm_qrexec_gui_ext4

Failed tests

24 failures
  • system_tests_qwt_win10_seamless@hw13

    • windows_clipboard_and_filecopy: unnamed test (unknown)
    • windows_clipboard_and_filecopy: Failed (test died)
      # Test died: no candidate needle with tag(s) 'windows-Edge-address-...
  • system_tests_audio@hw1

    • TC_20_AudioVM_Pulse_whonix-workstation-18: test_223_audio_play_hvm (error)
      qubes.exc.QubesVMError: Cannot connect to qrexec agent for 60 secon...

    • TC_20_AudioVM_Pulse_whonix-workstation-18: test_224_audio_rec_muted_hvm (error)
      qubes.exc.QubesVMError: Cannot connect to qrexec agent for 60 secon...

    • TC_20_AudioVM_Pulse_whonix-workstation-18: test_225_audio_rec_unmuted_hvm (error)
      qubes.exc.QubesVMError: Cannot connect to qrexec agent for 60 secon...

    • TC_20_AudioVM_Pulse_whonix-workstation-18: test_252_audio_playback_audiovm_switch_hvm (error)
      qubes.exc.QubesVMError: Cannot connect to qrexec agent for 60 secon...

  • system_tests_dispvm_perf@hw7

    • TC_00_DispVMPerf_debian-13-xfce: test_900_reader (failure)
      AssertionError: '/usr/lib/qubes/tests/dispvm_perf_reader.py --templ...

    • TC_00_DispVMPerf_fedora-42-xfce: test_900_reader (failure)
      AssertionError: '/usr/lib/qubes/tests/dispvm_perf_reader.py --templ...

    • TC_00_DispVMPerf_whonix-workstation-18: test_900_reader (failure)
      AssertionError: '/usr/lib/qubes/tests/dispvm_perf_reader.py --templ...

  • system_tests_extra

    • TC_00_InputProxy_debian-13-xfce: test_050_mouse_late_attach (failure)
      AssertionError: unexpectedly None : Device 'test-inst-input: Test i...

    • TC_00_InputProxy_fedora-42-xfce: test_050_mouse_late_attach (failure)
      AssertionError: unexpectedly None : Device 'test-inst-input: Test i...

  • system_tests_network_updates

    • TC_10_QvmTemplate_whonix-gateway-18: test_010_template_install (error)
      subprocess.CalledProcessError: Command 'timeout=120; while ! tor-ci...
  • system_tests_guivm_vnc_gui_interactive

    • collect_logs: wait_serial (wait serial expected)
      # wait_serial expected: qr/IJVTo-\d+-/...

    • collect_logs: Failed (test died + timed out)
      # Test died: command 'curl --form [email protected] --form upn...

    • collect_logs: wait_serial (wait serial expected)
      # wait_serial expected: qr/9FXye-\d+-/...

  • system_tests_audio

    • TC_20_AudioVM_Pulse_whonix-workstation-18: test_223_audio_play_hvm (error)
      qubes.exc.QubesVMError: Cannot connect to qrexec agent for 120 seco...

    • TC_20_AudioVM_Pulse_whonix-workstation-18: test_224_audio_rec_muted_hvm (error)
      qubes.exc.QubesVMError: Cannot connect to qrexec agent for 120 seco...

    • TC_20_AudioVM_Pulse_whonix-workstation-18: test_225_audio_rec_unmuted_hvm (error)
      qubes.exc.QubesVMError: Cannot connect to qrexec agent for 120 seco...

    • TC_20_AudioVM_Pulse_whonix-workstation-18: test_252_audio_playback_audiovm_switch_hvm (error)
      qubes.exc.QubesVMError: Cannot connect to qrexec agent for 120 seco...

  • system_tests_basic_vm_qrexec_gui_ext4

  • system_tests_whonix@hw7

    • whonixcheck: fail (unknown)
      Whonixcheck for sys-whonix failed...

    • whonixcheck: Failed (test died)
      # Test died: systemcheck failed at qubesos/tests/whonixcheck.pm lin...

  • system_tests_qwt_win11@hw13

    • windows_install: wait_serial (wait serial expected)
      # wait_serial expected: qr/install-complete-\d+-/...

    • windows_install: Failed (test died + timed out)
      # Test died: Install timed out at qubesos/tests/windows_install.pm ...

Fixed failures

Compared to: https://openqa.qubes-os.org/tests/158999#dependencies

16 fixed
  • system_tests_qwt_win10@hw13

    • windows_install: wait_serial (wait serial expected)
      # wait_serial expected: qr/kauF4-\d+-/...

    • windows_install: Failed (test died + timed out)
      # Test died: command 'script -e -c 'bash -x /usr/bin/qvm-create-win...

  • system_tests_qwt_win10_seamless@hw13

  • system_tests_dispvm_perf@hw7

  • system_tests_extra

    • TC_00_QVCTest_debian-13-xfce: test_010_screenshare (failure + cleanup)
      AssertionError: 2.4614345149565264 not less than 2.0

    • TC_00_QVCTest_whonix-gateway-18: test_010_screenshare (failure)
      AssertionError: 1 != 0 : Timeout waiting for /dev/video0 in test-in...

    • TC_00_QVCTest_whonix-workstation-18: test_010_screenshare (failure)
      AssertionError: 1 != 0 : Timeout waiting for /dev/video0 in test-in...

  • system_tests_guivm_gui_interactive

    • gui_keyboard_layout: unnamed test (unknown)
    • gui_keyboard_layout: Failed (test died)
      # Test died: no candidate needle with tag(s) 'work-xterm, work-xter...
  • system_tests_dispvm

    • system_tests: Fail (unknown)
      Tests qubes.tests.integ.dispvm failed (exit code 1), details report...

    • system_tests: Failed (test died)
      # Test died: Some tests failed at qubesos/tests/system_tests.pm lin...

    • TC_20_DispVM_whonix-workstation-18: test_030_edit_file (failure)
      AssertionError: Timeout waiting for editor window

    • TC_20_DispVM_whonix-workstation-18: test_100_open_in_dispvm (failure)
      AssertionError: './open-file test.txt' failed with ./open-file test...

  • system_tests_guivm_vnc_gui_interactive

    • gui_filecopy: unnamed test (unknown)
    • gui_filecopy: Failed (test died)
      # Test died: no candidate needle with tag(s) 'disp-text-editor' mat...

Unstable tests

Performance Tests

Performance degradation:

11 performance degradations
  • fedora-42-xfce_dom0-vm-api (mean:0.038): 0.46 🔻 ( previous job: 0.39, degradation: 117.35%)
  • dom0_root_seq1m_q8t1_read 3:read_bandwidth_kb: 126495.00 🔻 ( previous job: 365230.00, degradation: 34.63%)
  • dom0_root_seq1m_q1t1_write 3:write_bandwidth_kb: 44521.00 🔻 ( previous job: 130397.00, degradation: 34.14%)
  • dom0_root_rnd4k_q1t1_read 3:read_bandwidth_kb: 10641.00 🔻 ( previous job: 12000.00, degradation: 88.67%)
  • dom0_varlibqubes_rnd4k_q32t1_write 3:write_bandwidth_kb: 8604.00 🔻 ( previous job: 10236.00, degradation: 84.06%)
  • fedora-42-xfce_root_rnd4k_q32t1_read 3:read_bandwidth_kb: 73511.00 🔻 ( previous job: 85360.00, degradation: 86.12%)
  • fedora-42-xfce_root_rnd4k_q32t1_write 3:write_bandwidth_kb: 2222.00 🔻 ( previous job: 3964.00, degradation: 56.05%)
  • fedora-42-xfce_private_seq1m_q8t1_read 3:read_bandwidth_kb: 333516.00 🔻 ( previous job: 371835.00, degradation: 89.69%)
  • fedora-42-xfce_private_rnd4k_q1t1_write 3:write_bandwidth_kb: 269.00 🔻 ( previous job: 517.00, degradation: 52.03%)
  • fedora-42-xfce_volatile_seq1m_q1t1_read 3:read_bandwidth_kb: 308404.00 🔻 ( previous job: 349176.00, degradation: 88.32%)
  • debian-13-xfce_exec: 8.64 🔻 ( previous job: 6.42, degradation: 134.58%)

Remaining performance tests:

85 tests
  • debian-13-xfce_dom0-dispvm-api (mean:6.923): 83.08 🟢 ( previous job: 84.68, improvement: 98.11%)
  • debian-13-xfce_dom0-dispvm-gui-api (mean:7.906): 94.87 🟢 ( previous job: 98.62, improvement: 96.19%)
  • debian-13-xfce_dom0-dispvm-preload-2-api (mean:4.174): 50.09
  • debian-13-xfce_dom0-dispvm-preload-4-api (mean:3.795): 45.54
  • debian-13-xfce_dom0-dispvm-preload-2-gui-api (mean:4.324): 51.88
  • debian-13-xfce_dom0-dispvm-preload-4-gui-api (mean:3.796): 45.55
  • debian-13-xfce_dom0-vm-api (mean:0.035): 0.42 🟢 ( previous job: 0.59, improvement: 71.57%)
  • debian-13-xfce_dom0-vm-gui-api (mean:0.037): 0.44 🟢 ( previous job: 0.58, improvement: 75.82%)
  • fedora-42-xfce_dom0-dispvm-api (mean:7.367): 88.41 🟢 ( previous job: 89.70, improvement: 98.56%)
  • fedora-42-xfce_dom0-dispvm-gui-api (mean:8.423): 101.07 🟢 ( previous job: 107.00, improvement: 94.46%)
  • fedora-42-xfce_dom0-dispvm-preload-2-api (mean:4.706): 56.48
  • fedora-42-xfce_dom0-dispvm-preload-4-api (mean:4.085): 49.01
  • fedora-42-xfce_dom0-dispvm-preload-2-gui-api (mean:4.79): 57.48
  • fedora-42-xfce_dom0-dispvm-preload-4-gui-api (mean:3.953): 47.43
  • fedora-42-xfce_dom0-vm-gui-api (mean:0.042): 0.50 🟢 ( previous job: 0.54, improvement: 93.33%)
  • whonix-workstation-18_dom0-dispvm-api (mean:9.026): 108.31 🟢 ( previous job: 117.52, improvement: 92.17%)
  • whonix-workstation-18_dom0-dispvm-gui-api (mean:10.051): 120.62 🟢 ( previous job: 130.38, improvement: 92.51%)
  • whonix-workstation-18_dom0-dispvm-preload-2-api (mean:5.651): 67.81
  • whonix-workstation-18_dom0-dispvm-preload-4-api (mean:4.956): 59.48
  • whonix-workstation-18_dom0-dispvm-preload-2-gui-api (mean:5.756): 69.07
  • whonix-workstation-18_dom0-dispvm-preload-4-gui-api (mean:4.798): 57.58
  • whonix-workstation-18_dom0-vm-api (mean:0.037): 0.45 🟢 ( previous job: 0.60, improvement: 74.50%)
  • whonix-workstation-18_dom0-vm-gui-api (mean:0.04): 0.48 🔻 ( previous job: 0.45, degradation: 107.13%)
  • dom0_root_seq1m_q8t1_write 3:write_bandwidth_kb: 217276.00 🔻 ( previous job: 231269.00, degradation: 93.95%)
  • dom0_root_seq1m_q1t1_read 3:read_bandwidth_kb: 111872.00 🟢 ( previous job: 110619.00, improvement: 101.13%)
  • dom0_root_rnd4k_q32t1_read 3:read_bandwidth_kb: 75297.00 🟢 ( previous job: 39923.00, improvement: 188.61%)
  • dom0_root_rnd4k_q32t1_write 3:write_bandwidth_kb: 9573.00 🟢 ( previous job: 3264.00, improvement: 293.29%)
  • dom0_root_rnd4k_q1t1_write 3:write_bandwidth_kb: 3708.00 🟢 ( previous job: 2099.00, improvement: 176.66%)
  • dom0_varlibqubes_seq1m_q8t1_read 3:read_bandwidth_kb: 469792.00 🔻 ( previous job: 491827.00, degradation: 95.52%)
  • dom0_varlibqubes_seq1m_q8t1_write 3:write_bandwidth_kb: 202174.00 🟢 ( previous job: 103611.00, improvement: 195.13%)
  • dom0_varlibqubes_seq1m_q1t1_read 3:read_bandwidth_kb: 440393.00 🟢 ( previous job: 432938.00, improvement: 101.72%)
  • dom0_varlibqubes_seq1m_q1t1_write 3:write_bandwidth_kb: 170257.00 🟢 ( previous job: 143368.00, improvement: 118.76%)
  • dom0_varlibqubes_rnd4k_q32t1_read 3:read_bandwidth_kb: 106585.00 🟢 ( previous job: 103977.00, improvement: 102.51%)
  • dom0_varlibqubes_rnd4k_q1t1_read 3:read_bandwidth_kb: 7658.00 🔻 ( previous job: 8300.00, degradation: 92.27%)
  • dom0_varlibqubes_rnd4k_q1t1_write 3:write_bandwidth_kb: 4825.00 🟢 ( previous job: 3936.00, improvement: 122.59%)
  • fedora-42-xfce_root_seq1m_q8t1_read 3:read_bandwidth_kb: 398243.00 🟢 ( previous job: 346866.00, improvement: 114.81%)
  • fedora-42-xfce_root_seq1m_q8t1_write 3:write_bandwidth_kb: 219229.00 🟢 ( previous job: 137220.00, improvement: 159.76%)
  • fedora-42-xfce_root_seq1m_q1t1_read 3:read_bandwidth_kb: 338359.00 🔻 ( previous job: 355690.00, degradation: 95.13%)
  • fedora-42-xfce_root_seq1m_q1t1_write 3:write_bandwidth_kb: 61305.00 🟢 ( previous job: 26931.00, improvement: 227.64%)
  • fedora-42-xfce_root_rnd4k_q1t1_read 3:read_bandwidth_kb: 8324.00 🔻 ( previous job: 8945.00, degradation: 93.06%)
  • fedora-42-xfce_root_rnd4k_q1t1_write 3:write_bandwidth_kb: 1285.00 🟢 ( previous job: 457.00, improvement: 281.18%)
  • fedora-42-xfce_private_seq1m_q8t1_write 3:write_bandwidth_kb: 153833.00 🟢 ( previous job: 108858.00, improvement: 141.32%)
  • fedora-42-xfce_private_seq1m_q1t1_read 3:read_bandwidth_kb: 388937.00 🟢 ( previous job: 351871.00, improvement: 110.53%)
  • fedora-42-xfce_private_seq1m_q1t1_write 3:write_bandwidth_kb: 56742.00 🟢 ( previous job: 49472.00, improvement: 114.70%)
  • fedora-42-xfce_private_rnd4k_q32t1_read 3:read_bandwidth_kb: 73377.00 🔻 ( previous job: 75326.00, degradation: 97.41%)
  • fedora-42-xfce_private_rnd4k_q32t1_write 3:write_bandwidth_kb: 1920.00 🟢 ( previous job: 1909.00, improvement: 100.58%)
  • fedora-42-xfce_private_rnd4k_q1t1_read 3:read_bandwidth_kb: 8683.00 🟢 ( previous job: 8365.00, improvement: 103.80%)
  • fedora-42-xfce_volatile_seq1m_q8t1_read 3:read_bandwidth_kb: 369998.00 🟢 ( previous job: 361328.00, improvement: 102.40%)
  • fedora-42-xfce_volatile_seq1m_q8t1_write 3:write_bandwidth_kb: 178787.00 🟢 ( previous job: 140939.00, improvement: 126.85%)
  • fedora-42-xfce_volatile_seq1m_q1t1_write 3:write_bandwidth_kb: 57286.00 🟢 ( previous job: 46842.00, improvement: 122.30%)
  • fedora-42-xfce_volatile_rnd4k_q32t1_read 3:read_bandwidth_kb: 77312.00 🔻 ( previous job: 79581.00, degradation: 97.15%)
  • fedora-42-xfce_volatile_rnd4k_q32t1_write 3:write_bandwidth_kb: 3094.00 🟢 ( previous job: 2110.00, improvement: 146.64%)
  • fedora-42-xfce_volatile_rnd4k_q1t1_read 3:read_bandwidth_kb: 8297.00 🟢 ( previous job: 7708.00, improvement: 107.64%)
  • fedora-42-xfce_volatile_rnd4k_q1t1_write 3:write_bandwidth_kb: 1677.00 🟢 ( previous job: 861.00, improvement: 194.77%)
  • debian-13-xfce_exec-root: 27.19 🟢 ( previous job: 27.48, improvement: 98.95%)
  • debian-13-xfce_socket: 8.57 🟢 ( previous job: 8.68, improvement: 98.78%)
  • debian-13-xfce_socket-root: 8.30 🟢 ( previous job: 8.40, improvement: 98.76%)
  • debian-13-xfce_exec-data-simplex: 64.30 🟢 ( previous job: 67.70, improvement: 94.98%)
  • debian-13-xfce_exec-data-duplex: 69.38 🔻 ( previous job: 69.20, degradation: 100.26%)
  • debian-13-xfce_exec-data-duplex-root: 79.28 🟢 ( previous job: 86.64, improvement: 91.50%)
  • debian-13-xfce_socket-data-duplex: 133.93 🟢 ( previous job: 136.31, improvement: 98.25%)
  • fedora-42-xfce_exec: 9.13 🟢 ( previous job: 9.23, improvement: 98.97%)
  • fedora-42-xfce_exec-root: 59.97 🔻 ( previous job: 59.89, degradation: 100.13%)
  • fedora-42-xfce_socket: 8.71 🔻 ( previous job: 8.23, degradation: 105.83%)
  • fedora-42-xfce_socket-root: 8.35 🔻 ( previous job: 8.31, degradation: 100.43%)
  • fedora-42-xfce_exec-data-simplex: 67.18 🔻 ( previous job: 66.18, degradation: 101.51%)
  • fedora-42-xfce_exec-data-duplex: 65.21 🟢 ( previous job: 71.51, improvement: 91.20%)
  • fedora-42-xfce_exec-data-duplex-root: 97.95 🔻 ( previous job: 96.52, degradation: 101.48%)
  • fedora-42-xfce_socket-data-duplex: 135.23 🟢 ( previous job: 137.78, improvement: 98.15%)
  • whonix-gateway-18_exec: 7.79 🟢 ( previous job: 8.52, improvement: 91.42%)
  • whonix-gateway-18_exec-root: 130.91 🔻 ( previous job: 129.05, degradation: 101.45%)
  • whonix-gateway-18_socket: 7.73 🟢 ( previous job: 8.26, improvement: 93.52%)
  • whonix-gateway-18_socket-root: 7.68 🟢 ( previous job: 7.76, improvement: 98.98%)
  • whonix-gateway-18_exec-data-simplex: 72.89 🟢 ( previous job: 73.13, improvement: 99.68%)
  • whonix-gateway-18_exec-data-duplex: 72.93 🔻 ( previous job: 71.98, degradation: 101.32%)
  • whonix-gateway-18_exec-data-duplex-root: 140.32 🟢 ( previous job: 146.68, improvement: 95.67%)
  • whonix-gateway-18_socket-data-duplex: 139.96 🟢 ( previous job: 142.71, improvement: 98.07%)
  • whonix-workstation-18_exec: 8.62 🟢 ( previous job: 8.85, improvement: 97.42%)
  • whonix-workstation-18_exec-root: 146.45 🟢 ( previous job: 149.84, improvement: 97.74%)
  • whonix-workstation-18_socket: 9.01 🟢 ( previous job: 9.12, improvement: 98.79%)
  • whonix-workstation-18_socket-root: 8.61 🟢 ( previous job: 8.75, improvement: 98.47%)
  • whonix-workstation-18_exec-data-simplex: 69.71 🔻 ( previous job: 67.02, degradation: 104.02%)
  • whonix-workstation-18_exec-data-duplex: 71.39 🔻 ( previous job: 69.25, degradation: 103.09%)
  • whonix-workstation-18_exec-data-duplex-root: 149.28 🔻 ( previous job: 146.32, degradation: 102.02%)
  • whonix-workstation-18_socket-data-duplex: 132.20 🟢 ( previous job: 139.15, improvement: 95.00%)
