Summary: | VRR below 30-ish FPS turns off completely | |
---|---|---|---
Product: | [Plasma] kwin | Reporter: | fililip <team>
Component: | platform-drm | Assignee: | KWin default assignee <kwin-bugs-null>
Status: | CONFIRMED | |
Severity: | normal | CC: | eskot98, team, tinozzo123, xaver.hugl
Priority: | NOR | Keywords: | qt6
Version First Reported In: | 6.0.1 | |
Target Milestone: | --- | |
Platform: | Arch Linux | |
OS: | Linux | |
Latest Commit: | | Version Fixed In: |
Sentry Crash Report: | | |
Description
fililip
2024-03-11 14:11:11 UTC
I should probably add that using the hardware cursor works fine (I was using the software one), but I'd personally prefer to have the option to keep even a low software cursor refresh rate, or at least not to have this behavior at all when I'm not moving the mouse.

This is unfortunately intentional for now; it's indeed there to avoid brightness flicker with many displays at such low refresh rates, and to work around driver bugs and API limitations with the hardware cursor.
This limitation will be removed again (or at least changed to a lower minimum refresh rate) once LFC is implemented in KWin instead of the driver.
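For context, low framerate compensation (LFC) keeps a VRR panel inside its supported refresh range by presenting each content frame more than once whenever the content runs slower than the panel's minimum rate. Below is a minimal sketch of that idea; the helper name, the parameters, and the 48 Hz panel minimum in the comment are made up for illustration, and this is not KWin's or the driver's actual implementation:

#include <chrono>

// Illustrative only: compute how many times to repeat a content frame so
// that the presentation interval stays within the panel's VRR range.
// Example: content at 24 FPS on a 48-144 Hz panel -> repeat each frame
// twice, so the panel keeps refreshing at 48 Hz instead of dropping out
// of its VRR range.
int lfcRepeatCount(std::chrono::nanoseconds contentFrameTime,
                   std::chrono::nanoseconds maxVrrFrameTime) // 1 s / minimum VRR rate
{
    int repeats = 1;
    while (contentFrameTime / repeats > maxVrrFrameTime) {
        ++repeats;
    }
    return repeats;
}

Doing this in the compositor rather than relying on the driver's kernel-side LFC is what would allow the current 30 FPS restriction to be dropped or at least lowered.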
> Is there a way to override this behavior with some kind of envvar?
No, this is not adjustable.
Ok, thanks for the quick response! I appreciate it. I'll stick to the HW cursor for now, then. One note I'll add: after resuming from suspend, even when using the HW cursor, VRR still breaks below 30 FPS and requires a restart of KWin. Is this just random behavior or a bug?

I managed to add an option to override this behavior with an environment variable; LFC finally works as it did before.

diff --git a/src/core/renderloop.cpp b/src/core/renderloop.cpp
index 5e3a74c..de7b7d3 100644
--- a/src/core/renderloop.cpp
+++ b/src/core/renderloop.cpp
@@ -201,10 +201,12 @@ void RenderLoop::scheduleRepaint(Item *item)
     if (d->pendingRepaint) {
         return;
     }
+    bool isEnvVarSet = false;
+    const bool forceKernelLFC = qEnvironmentVariableIntValue("KWIN_FORCE_KERNEL_LFC", &isEnvVarSet) != 0 && isEnvVarSet;
     const bool vrr = d->presentationMode == PresentationMode::AdaptiveSync || d->presentationMode == PresentationMode::AdaptiveAsync;
     if (vrr && workspace()->activeWindow() && d->output) {
         Window *const activeWindow = workspace()->activeWindow();
-        if (activeWindow->isOnOutput(d->output) && activeWindow->surfaceItem() && item != activeWindow->surfaceItem() && activeWindow->surfaceItem()->frameTimeEstimation() <= std::chrono::nanoseconds(1'000'000'000) / 30) {
+        if (activeWindow->isOnOutput(d->output) && activeWindow->surfaceItem() && item != activeWindow->surfaceItem() && !forceKernelLFC && activeWindow->surfaceItem()->frameTimeEstimation() <= std::chrono::nanoseconds(1'000'000'000) / 30) {
             return;
         }
     }

Update: this is a problem only once https://bugs.kde.org/show_bug.cgi?id=485425 happens. Kernel LFC works fine until then. (This is why it's not that obviously noticeable, since I don't run such Wine/X11 programs every session.) In fact, the patch I sent above makes matters even worse: VRR is completely disabled if the situation above occurs. This means that bug #485425 is related to this one. (Is it possible to mark it as such?)
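For anyone who wants to try the patch above: qEnvironmentVariableIntValue() returns the variable's value as an integer (0 if it is unset or not numeric) and reports through its bool* argument whether it was set and parsed, so the override only takes effect when KWIN_FORCE_KERNEL_LFC is explicitly set to a non-zero value, for example exported as KWIN_FORCE_KERNEL_LFC=1 in the environment kwin_wayland is started with; leaving it unset keeps the existing below-30-FPS repaint suppression.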