I upgraded from KDE 4 to 5, and now kwin is skipping frames once or twice per second when watching 60FPS video. The problem is reproducible with glxgears when forcing vsync. I'm using the nvidia drivers, and vsync can be forced on with:

__GL_SYNC_TO_VBLANK=1 glxgears

The animation will stutter once or twice per second. kwin 4 did not have this issue. The frame skipping makes 60FPS video stuttery; watching TV is no longer a good experience because of it. Disabling vsync in System Settings makes the issue disappear, but that introduces tearing, which is not acceptable at all. The problem is there with both the OpenGL 2.1 and 3.0 backends.

My system:
GPU: NVidia GTX 780
CPU: Core i5 2500K 3.2GHz
RAM: 16GB DDR3 1600MHz
NVidia driver: 355.06 (the problem is there with all previous versions too)
X.Org server: 1.17.2
OS: Gentoo Linux AMD64

Reproducible: Always
Please attach the output of:

qdbus org.kde.KWin /KWin supportInformation

and have a look at the refresh rate setting. In general, this is likely bug #344433.
Created attachment 94211 [details] qdbus org.kde.KWin /KWin supportInformation Attached. The bug you linked to seems to be a triple buffering issue? I disabled triple buffering here and the problem persists. Is this still a duplicate?
No, I take that back. After I started kwin with: __GL_YIELD="USLEEP" kwin_x11 --replace & The problem went away. However, now I have *INSANE* input lag in everything. From moving windows to typing.
Meh, KWin too old (of course, silly me). Run (from konsole):

kwin_x11 --replace 2>&1 | grep -i refresh &

and check the output. The other bug is primarily about a wrong refresh rate reported by the nvidia driver through xrandr - if kwin thinks you're operating @50Hz but you're actually on 60Hz, we update too rarely, constantly losing frames.
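As a back-of-the-envelope illustration of the mismatch described above (the numbers are hypothetical, not taken from kwin):

```shell
#!/bin/sh
# If the display scans out at 60 Hz but the compositor only repaints at
# the 50 Hz it (wrongly) believes in, the shortfall is the number of
# frames dropped every second.
display_hz=60        # real monitor refresh rate
assumed_hz=50        # rate kwin thinks it should repaint at
lost=$(( display_hz - assumed_hz ))
echo "frames lost per second: $lost"   # frames lost per second: 10
```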
kwin_core: Vertical Refresh rate 60 Hz ( "primary screen" ) However, if I switch my monitor to 50Hz, there's no problem at all. No skipping in glxgears. 60FPS video will of course not look good at 50Hz, so that's no solution.
I updated to kwin 5.4.0 and the problem persists. Also, I just tested with triple buffering enabled, but with __GL_YIELD="USLEEP" set for kwin_x11. This also fixes the problem. So to sum it up: triple buffering enabled or disabled without __GL_YIELD="USLEEP" results in frame skipping. Setting __GL_YIELD="USLEEP" fixes frame skipping regardless of whether triple buffering is enabled or not. Can we mark this as a duplicate now, or is this a separate issue?
Double buffering w/o yielding usleep will cause kwin to turn v'sync off (triple buffering will allow v'sync regardless). Unsync'd painting @60Hz while the content syncs to screen @60Hz should however not directly lead to a frame loss.

What about the input lag?

What happens on:

export KWIN_EXPLICIT_SYNC=0
export __GL_YIELD=USLEEP
kwin_x11 --replace &
(In reply to Thomas Lübking from comment #7)
> Double buffering w/o yielding usleep will cause kwin to turn v'sync off
> (triple buffering will allow v'sync regardless)

Hm, if I go into System Settings and toggle vsync (disable it, hit "apply", then enable it and hit "apply"), then vsync is enabled even though triple buffering in the nvidia driver is disabled. However, frame stutter is back when doing that. Which is a new observation: there's frame skipping even with double buffering?

> What about the input lag?

I don't know anymore. It seems random. If I simply restart kwin with __GL_YIELD=USLEEP set and with double buffering, there's input lag. When dragging a window, for example, the mouse cursor moves, but the window follows what seems like 70-100ms later. But when I then go into System Settings and toggle vsync off/on again, the lag disappears. With triple buffering enabled and __GL_YIELD=USLEEP set, there's no input lag (or rather there is, but it's very small: the mouse cursor is ahead of the window by only a few pixels, rather than 1/4 of the screen.)

> What happens on
> export KWIN_EXPLICIT_SYNC=0
> export __GL_YIELD=USLEEP
> kwin_x11 --replace &

KWIN_EXPLICIT_SYNC=0 seems to have no effect on anything, both with triple buffering enabled and disabled.
Btw, if anyone else is reading this, the way I've set it up right now is to have triple buffering enabled*, and have an executable script named /usr/local/bin/kwin_x11 with this in it:

#!/bin/sh
__GL_YIELD="USLEEP" exec /usr/bin/kwin_x11 "$@"

This gets rid of the stutter.

* Normally I would have triple buffering disabled, but I gave up entirely on gaming under Linux and use a Windows installation instead, so now I have it enabled to get a good desktop experience.
Sounds rather like (related to) bug #343184. If kwin thinks you're triple buffering while you're actually double buffering, not only does vsync remain enabled, it will also notably wait too long between two framebuffer swaps. We certainly need to figure out a more reliable way to detect this than measuring swap times (which has increasingly proven unreliable lately).
Contrary to that bug report, when not using TB, on login vsync is disabled and suspend/unsuspend compositing doesn't enable vsync. Going to system settings and toggling it there does enable it.
More trouble: If I switch my monitor to 50Hz (for 50FPS video, PAL Europe), then kwin still thinks the refresh rate is 60Hz:

$ qdbus org.kde.KWin /KWin supportInformation | grep -i refre
refreshRate: 0
Refresh Rate: 60

However, there is no frame skipping. Everything is smooth. If I log out and in again, kwin correctly detects that the refresh rate is 50Hz, but there's extreme frame skipping (also in glxgears), and video is pretty much unwatchable.
I have the same issue using an Intel HD5500 card. None of the tricks work for me - except for disabling composition.
I don't know if it is interesting, but the kwin FPS measurement graph shows 56 fps while my glxgears fluctuates, showing every possible number between 60 and 160.
The output is unsync'd - glxgears paints as fast as it can, kwin is capped at 60 fps (by default). I assume you're using the vesa driver (software emulation) and not facing this bug? 160 fps in glxgears is incredibly little (even for an intel IGP ;-) and you're likely losing frames because the system simply cannot keep the pace. Check /var/log/Xorg.0.log and glxgears on this. Also, "qdbus org.kde.KWin /KWin supportInformation" will likely reveal xrender compositing?
I'll give a little more info - then maybe you can tell me if it's a different bug. The characteristics are like this: with the compositor disabled I get 60 Hz glxgears. Using any of the other compositors I get this fluctuating frame rate. Hardware acceleration works; at least glxinfo says so. Running glxgears with vblank_mode=0 gives me around 8000 fps and one maxed-out CPU core. Is this a different bug?
I'm not sure ;-)
The sync'd glxgears behavior is strange, but the undercut kwin sync rate of course means frame skipping. Because of comment #12, let's try to simply raise the assumed refresh rate. In ~/.config/kwinrc, add to the Compositing section:

[Compositing]
# other settings
RefreshRate=70
MaxFPS=70

Run "kwin_x11 --replace &" and see what happens. Also attach the output of

qdbus org.kde.KWin /KWin supportInformation

from before and after the config change.
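The edit above can also be scripted. This is a sketch that writes the two keys into a scratch file rather than the live config; on a real system you would edit ~/.config/kwinrc itself, or use kwriteconfig5 as noted in the comment (both the scratch-file approach and the kwriteconfig5 alternative are suggestions, not part of the original instructions):

```shell
#!/bin/sh
# Append the suggested Compositing keys to a scratch config file.
# Real-world alternative (KDE Frameworks 5):
#   kwriteconfig5 --file kwinrc --group Compositing --key RefreshRate 70
#   kwriteconfig5 --file kwinrc --group Compositing --key MaxFPS 70
cfg=$(mktemp)
printf '[Compositing]\nRefreshRate=70\nMaxFPS=70\n' >> "$cfg"
grep '=70' "$cfg"    # shows the two keys that were written
rm -f "$cfg"
```

After changing the real kwinrc, restart the compositor with "kwin_x11 --replace &" so the new values take effect.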
Wow - it actually works. I've done some fiddling with the values and experiments:
* It does not work when both values are set to 60.
* Both values have to be set.
* Moving the glxgears window - or any other window - around increases the framerate. So it only works for a static window system.
Created attachment 94429 [details] Before adding MaxFPS=70 and RefreshRate=70
Created attachment 94430 [details] After adding MaxFPS=70 and RefreshRate=70
(In reply to Søren Holm from comment #18) > * Both values have to be set. They're merged, so that's not much of a surprise. > * Moving the glxgear window - or any other window around increase the > framerate. Framerate of KWin or glxgears? Can you please a) attach the output of glxinfo b) run "kcmshell5 kwincompositing" and set the tearing prevention to "full scene repaints"? (do *not* even try to "copy frontbuffer")
glxgears framerate. Setting tearing prevention to "full scene repaints" clamps the glxgears framerate to around 60 - which means that it sort of "solves" the issue.
Created attachment 94461 [details] My glxinfo
Can anybody try https://git.reviewboard.kde.org/r/125659/ on double buffered compositing?
(In reply to Thomas Lübking from comment #24)
> Can anybody try https://git.reviewboard.kde.org/r/125659/ on double buffered
> compositing?

Without __GL_YIELD="USLEEP": Heavy stutter. Also, tearing starts when starting glxgears. Before starting glxgears, vsync works. After starting it, vsync breaks and a tear line moves from bottom to top (indicating that the framerate does not match the refresh rate but is a tad higher.) KWin still reports "Refresh Rate: 60" though.

With __GL_YIELD="USLEEP": No stutter, no tearing.
Thanks for testing. W/o usleep, vsync is disabled; this is unrelated to glxgears and expected - but it should not "stutter" (whatever that means in this context), as glXSwapBuffers does *never* block.

Can you try this patch on top of the other patch and see whether double buffering WITHOUT __GL_YIELD="USLEEP" still stutters (skips frames)?

diff --git a/glxbackend.cpp b/glxbackend.cpp
index a639acc..dfe1f08 100644
--- a/glxbackend.cpp
+++ b/glxbackend.cpp
@@ -632,7 +632,8 @@ void GlxBackend::present()
         m_swapProfiler.begin();
     }
     glXSwapBuffers(display(), glxWindow);
-    glXWaitGL();
+    if (blocksForRetrace())
+        glXWaitGL();
     if (gs_tripleBufferNeedsDetection) {
         if (char result = m_swapProfiler.end()) {
             gs_tripleBufferUndetected = gs_tripleBufferNeedsDetection = false;
(In reply to Thomas Lübking from comment #26) > but it should not "stutter" (whatever that means in this context) as > glxSwapBuffers does *never* block. Well, it stutters. Or rather, "microstutters". Looks like this: http://www.testufo.com/#test=stutter&demo=microstuttering&foreground=FFFFFF&background=000000&max=12&pps=720 > Can you try this patch on top of the other patch and see whether double > buffering WITHOUT __GL_YIELD="USLEEP" still stutters (skips frames?) Yep, no change. It still stutters.
(In reply to Nikos Chantziaras from comment #27)
> Well, it stutters. Or rather, "microstutters".

Let me clarify that: if I disable vsync *without* __GL_YIELD="USLEEP", it stutters just the same way as with vsync on, except that now there's tearing on top of it. If I disable vsync but start kwin *with* __GL_YIELD="USLEEP", then there are zero stutters, but still tearing (obviously.) vsync doesn't seem to cause or otherwise affect the stuttering. Only __GL_YIELD="USLEEP" does.
Goddamit, this is complicated :-P

I need to clarify further: in all cases where there's stutter, vsync in *kwin* does not seem to affect it, except in the __GL_YIELD="USLEEP" case with vsync disabled in kwin and also disabled in the GL app. vsync in the GL application *will* affect it. For example, starting glxgears *without* __GL_SYNC_TO_VBLANK=1 will not produce any stutter.

This is getting complicated, so here's a summary:

__GL_YIELD="USLEEP" NOT set for kwin:
vsync ON in kwin, __GL_SYNC_TO_VBLANK=1 in glxgears -> stutter
vsync ON in kwin, __GL_SYNC_TO_VBLANK=0 in glxgears -> smooth
vsync OFF in kwin, __GL_SYNC_TO_VBLANK=1 in glxgears -> stutter
vsync OFF in kwin, __GL_SYNC_TO_VBLANK=0 in glxgears -> smooth

__GL_YIELD="USLEEP" SET for kwin:
vsync ON in kwin, __GL_SYNC_TO_VBLANK=1 in glxgears -> smooth
vsync ON in kwin, __GL_SYNC_TO_VBLANK=0 in glxgears -> smooth
vsync OFF in kwin, __GL_SYNC_TO_VBLANK=1 in glxgears -> smooth
vsync OFF in kwin, __GL_SYNC_TO_VBLANK=0 in glxgears -> stutter
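For anyone retesting this matrix by hand, the four glxgears-side combinations per kwin configuration can be enumerated with a loop. This sketch only prints the combinations to try (each one still needs a manual kwin restart and a visual check, so nothing is actually launched):

```shell
#!/bin/sh
# Print the four test combinations from the summary above.
# "unset" stands for not exporting __GL_YIELD for kwin at all.
for yield in unset USLEEP; do
  for vblank in 1 0; do
    echo "kwin __GL_YIELD=$yield / glxgears __GL_SYNC_TO_VBLANK=$vblank"
  done
done
```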
(In reply to Nikos Chantziaras from comment #29)
> This is getting complicated, so here's a summary:
>
> __GL_YIELD="USLEEP" NOT set for kwin:
> vsync ON in kwin, __GL_SYNC_TO_VBLANK=1 in glxgears -> stutter
> vsync ON in kwin, __GL_SYNC_TO_VBLANK=0 in glxgears -> smooth
> vsync OFF in kwin, __GL_SYNC_TO_VBLANK=1 in glxgears -> stutter
> vsync OFF in kwin, __GL_SYNC_TO_VBLANK=0 in glxgears -> smooth

The two sets are equal, because kwin isn't syncing in either case. What will happen is that the sync'd glxgears "shifts" against the randomly phased but fixed-clock kwin repaint - every now and then a glxgears frame is too early or too late for kwin's internal clock rate (after all, nothing is synced here). Increasing RefreshRate and MaxFPS in kwin should "fix" that.

> __GL_YIELD="USLEEP" SET for kwin:
> vsync ON in kwin, __GL_SYNC_TO_VBLANK=1 in glxgears -> smooth
> vsync ON in kwin, __GL_SYNC_TO_VBLANK=0 in glxgears -> smooth

Good for KWin ;-)
It would seem that you should get tearing in glxgears (but since it repaints at some thousands of FPS, you would not see a strong tearing offset - the inter-frame differences are just low).

> vsync OFF in kwin, __GL_SYNC_TO_VBLANK=1 in glxgears -> smooth

This is interesting, as the same conditions produced "stutter" above - the relaxed scheduling in KWin might help, but we're on voodoo here.

> vsync OFF in kwin, __GL_SYNC_TO_VBLANK=0 in glxgears -> stutter

In contrast, the unsynced (painting as fast as possible) glxgears might "steal" too many CPU slices from kwin here - you might want to try __GL_YIELD="USLEEP" for glxgears as well (or globally).
Git commit 8bea96d7018d02dff9462326ca9456f48e9fe9fb by Thomas Lübking. Committed on 11/11/2015 at 21:18. Pushed by luebking into branch 'master'. wait for GL after swapping otherwise at least on the nvidia blob the swapping doesn't block even for double buffering REVIEW: 125659 Related: bug 346275 FIXED-IN: 5.5 M +4 -0 glxbackend.cpp http://commits.kde.org/kwin/8bea96d7018d02dff9462326ca9456f48e9fe9fb
I applied this on top of 5.4.3. What exactly does this fix though? The frame skipping is still there.
Fixes bug #346275, just wanted to make you aware of the change.

The behavior is mostly expected ("WONTFIX") and explained in comment #30. The only (but rather pleasant) surprise is:

> vsync OFF in kwin, __GL_SYNC_TO_VBLANK=1 in glxgears -> smooth
(In reply to Thomas Lübking from comment #33) > Fixes bug #346275, just wanted to get you aware of the change. > > The behavior is mostly expectable ("WONTFIX") and explained in comment #30 So you're saying kwin is supposed to stutter out-of-the-box for all nvidia users? :-/
If you turn off sync'ing and run 60FPS (sync'd glxgears) against 60FPS (kwin clock), you get microstutter; that has nothing to do with the GPU.

We had "repaint as fast as possible" (was a bug in some 4.0 release) and got massive complaints about that, so we're not doing that again - kwin isn't a first-person shooter; there are usually better things to spend the CPU on.

The "out of the box issue" is that you don't get syncing w/o usleep yielding or triple buffering. That is bug #322060 - we might release that in case the nvidia blob remains on the unblocked swap (no block, no wait, no cpu load), but that's something to learn. It's also an out-of-the-box issue of mostly distros or the nvidia blob (default behavior of the nvidia driver). Because "actual" kwin is a library, we have trouble setting the environment "in time" and "when required" (thus the wrapper script on the other bug; the idea was not accepted in the past).
(In reply to Thomas Lübking from comment #35) > If you turn off sync'ing and run 60FPS (sync'd glxgears) against 60FPS (kwin > clock) you get microstutter, that has nothing to do with the GPU. > > We had "repaint as fast as possible" (was a bug in some 4.0 release) and got > massive complaints on that, so we're not again, no - kwin isn't an > egoshooter; there's usually better thigs to waste the CPU on. kwin affects anything you see. For example, TV is borderline unwatchable with kwin without usleep. It's really, *really* bad. Meanwhile on Windows 7, 8, and 10: *perfect* image. Never drops a single frame. You can even play video games in borderless windowed mode without issues (normally exclusive fullscreen mode is used for games.) Is Windows and its compositor so fundamentally different? Can reliability like this not be achieved with Linux and OpenGL? > The "out of the box issue" is that you don't get syncing w/o usleep yielding > or triple buffering. Note that it's not an or-relation. Without usleep, triple buffering doesn't help at all. It skips frames just the same. I think it might be beneficial to introduce a kwin loader binary that sets usleep and then execve()'s the actual binary. Right now, we're on the way to establishing a "yeah, kwin is crappy like that, use something else if you want smooth video" mentality.
(In reply to Nikos Chantziaras from comment #36)
> Meanwhile on Windows 7, 8, and 10: *perfect* image. Never drops a single
> frame.

Yes, ensure syncing.

> You can even play video games in borderless windowed mode without
> issues (normally exclusive fullscreen mode is used for games.)

That sounds like fullscreen unredirection?
I thought it's w/o trouble in general?

> Is Windows and its compositor so fundamentally different?

Actually yes. But that's not relevant in /this/ regard.

> Note that it's not an or-relation. Without usleep, triple buffering doesn't
> help at all. It skips frames just the same.

That rather sounds like misdetection (bug #343184) - otherwise it would contradict your former description.

Personally, I do not oppose making upstream kwin a script, notably given the env list we have:
https://community.kde.org/KWin/Environment_Variables
(In reply to Thomas Lübking from comment #37) > > You can even play video games in borderless windowed mode without > > issues (normally exclusive fullscreen mode is used for games.) > > That sounds like fullscreen unredirection? > I thought it's w/o trouble in general? kwin? Yeah, it works. But either for every window or for none. You can't have games use exclusive fullscreen and then, say, media players and browsers use normal borderless window mode. You can't differentiate between fullscreen windows that should be composited and "true" fullscreen apps that should not be composited, like you can on Windows. > > Note that it's not an or-relation. Without usleep, triple buffering doesn't > > help at all. It skips frames just the same. > > Tha rather sounds like misdetected (bug #343184) - otherwise it would cross > you former description. I use TB, but starting kwin with: KWIN_TRIPLE_BUFFER=1 /usr/bin/kwin_x11 --replace & still stutters.
(In reply to Nikos Chantziaras from comment #38) > kwin? Yeah, it works. No, I meant windows (since you stressed "borderless windowed mode") > But either for every window or for none. The unredirection approach is stupid anyway, since the resources are still used. http://kde-look.org/content/show.php/GameMode?content=156659 > I use TB, but starting kwin with: > KWIN_TRIPLE_BUFFER=1 /usr/bin/kwin_x11 --replace & > still stutters. That would be more than odd -> in what context? (ie. you're using kwin v'sync and ...?... to cause that, in case of glxgears, synced or not? Are non GL clients affected as well?) The default is sched_yield() which is NOOP (thus the usleep demand); it would seem this drains too much time from the painting client? However, latest nvidia drivers do, as mentioned, not wait at all (and should not on actual triple buffering in any version), so *right now* the value should have zero impact...
(In reply to Thomas Lübking from comment #39) > (In reply to Nikos Chantziaras from comment #38) > > > kwin? Yeah, it works. > No, I meant windows (since you stressed "borderless windowed mode") Oh. In that case, I don't understand the question :-P If you run a game in real fullscreen mode, there's no compositing and the game gets full control of the screen (it can set whatever resolution and refresh rate it wants without affecting the desktop.) In windowed mode, the compositor is active and the game goes through it. However, the Windows compositor is so reliable, that the game runs just as well in windowed mode as it does in real fullscreen. (Borderless window mode is not real fullscreen. The game does not control the screen. It just runs in a normal window, just without distracting window borders. But it's not really fullscreen.) > > I use TB, but starting kwin with: > > KWIN_TRIPLE_BUFFER=1 /usr/bin/kwin_x11 --replace & > > still stutters. > > That would be more than odd -> in what context? > (ie. you're using kwin v'sync and ...?... to cause that, in case of > glxgears, synced or not? Are non GL clients affected as well?) > > The default is sched_yield() which is NOOP (thus the usleep demand); it > would seem this drains too much time from the painting client? > However, latest nvidia drivers do, as mentioned, not wait at all (and should > not on actual triple buffering in any version), so *right now* the value > should have zero impact... Oh. Will have to upgrade the drivers at some point then. I'm on 355.11 right now (previous release) because kwin displays pixel garbage with 358.09 (there's just random pixels on the screen). Will have to try again at some point.
(In reply to Nikos Chantziaras from comment #40)
> If you run a game in real fullscreen mode, there's no compositing

So windows performs some fullscreen unredirection ;-)

> I'm on 355.11 right now

That's "latest" enough, though I cannot guarantee the behavior for all GPUs. However, if you've TB enabled (check Xorg.0.log) and overridden the KWin detection, I can hardly imagine the impact of the yielding strategy... do you use some aurorae deco (incl. plastik) or still breeze? Do you export __GL_YIELD globally or only for kwin? (Since globally, it will also impact the desktop and each and every other QtQuick-using process.)
(In reply to Thomas Lübking from comment #41)
> (In reply to Nikos Chantziaras from comment #40)
> > If you run a game in real fullscreen mode, there's no compositing
>
> So windows performs some fullscreen unredirection ;-)

AFAIK, it also frees GPU resources allocated to the compositor. You get more free VRAM, for example. In any event, it's very clear that something happens behind the scenes, since the monitor loses signal for a tiny moment as Windows disables compositing and other desktop stuff, and this also happens every time you alt+tab to the desktop and back. (This "real fullscreen" mode is a purely gaming-oriented thing, btw. It's not very useful for anything other than making video games run as fast as possible.)

> > I'm on 355.11 right now
> That's "latest" enough, though I cannot guarantee the behavior for all GPUs.
> However if you've TB enabled (check Xorg.0.log) and overridden the KWin
> detection, I can hardly imagine the impact of the yielding strategy... do
> you use some aurorae deco (incl. plastik) or still breeze?

Nope. Just Breeze. It's a vanilla KDE installation (I'm using Gentoo, so the KDE packages are not even "distro-themed" or anything. They're plain vanilla KDE.)

> Do you export __GL_YIELD globally or only for kwin (since globally, it will
> also impact the desktop and each and every other QtQuick-using process)

I only set it for kwin. It's a script (/usr/local/bin/kwin) that gets called instead of /usr/bin/kwin since /usr/local/bin comes first in PATH. There's just one line in the script:

__GL_YIELD="USLEEP" exec /usr/bin/kwin_x11 "$@"
(In reply to Nikos Chantziaras from comment #42) > It's a script (/usr/local/bin/kwin) that gets called > instead of /usr/bin/kwin since /usr/local/bin comes first in PATH. Typo. "/usr/local/bin/kwin_x11", not "kwin".
(In reply to Nikos Chantziaras from comment #42)
> AFAIK, it also frees GPU resources allocated to the compositor.

That's what the compositing blocking - window property - kwin rule - game mode script do (fullscreen or not).

> since the monitor loses signal for a tiny moment as

Well, we can do w/o that :-P

> Nope. Just Breeze.
> I only set it for kwin.

Then I'm momentarily out of ideas where yielding would have an impact... (maybe on general glFlushes)
Video now went from borderline unwatchable to actually unwatchable after a KDE Plasma update (going to kwin 5.5.2). I also upgraded my GPU to a GTX 980 Ti. The frame skipping in video is now so extreme that you can see frames flickering back and forth all the time. It doesn't matter if I watch a video with mpv or with Chrome's HTML5 video player on YouTube. It is so bad that I'm now looking at how to switch my system from KDE to something else, after 17 years of KDE use :-(
I beg your pardon? Did anything else change compared to comments #29 and following?
Yes. It now has extreme frame skipping even with __GL_YIELD="USLEEP" and TB.
That's pretty interesting, as we didn't change anything in that regard (except for the patch in comment #31, which you already tried and which doesn't apply on TB anyway).

What happens if you raise the desired swap frequency (as in comment #17 - feel free to try even higher values)?
PS, does the same problem also occur for glxgears (or is it limited to video playback)
> What happens if you raise the desired swap frequency (as in comment #17, feel free to try even higher values)

OK, I tried 70 and it works! No stutter, but there's input lag. I've tried 61 and it works too, and without the input lag of 70. I then tried 60.1, 60.2, 60.9, but none of them work. It doesn't like fractional numbers (which is weird, because some monitor modes ARE inherently fractional, like 47.952Hz for 23.976FPS video content.)

> PS, does the same problem also occur for glxgears (or is it limited to video playback)

Hm, glxgears doesn't seem to suffer. It's only video that does.

But thank you for pointing out comment #17 again. I somehow missed it completely. You just saved me hours of messing around with installation and setup of other window managers :-)
Btw, since nothing changed in kwin, this has to be a side effect of the GPU upgrade? I wonder if rendering speed matters. I had a 780 before, which is already a very fast GPU, and the 980 Ti I have now is even faster. Could it be that there's some bug that only triggers when rendering times/latency are very low?
Well, turns out 61 doesn't work for 1080p video. Or even slightly higher values (62, 63). It works only for lower resolutions. I've set MaxFPS=70 and RefreshRate=70 to get stutter-free 1080p@60FPS. The driver version doesn't seem to matter, btw. 352.63, 355.11 and 358.16 all exhibit the problem. I didn't try 361.x versions yet, because they seem to break KDE and a lot of Qt applications so I'll have to wait for things to be sorted out first.
Closing. I don't care anymore.