KWin OpenGL compositing in KDE 4.12.0 causes my KDE desktop to slow down or freeze soon (10-20 seconds) after I enable desktop effects with Ctrl-Alt-F12. I'm using the NVidia proprietary driver 331.20 (x64). If I run ksysguard there is no increase in CPU (AMD Phenom II 1090T, 3.2 GHz, 6 cores) or RAM (8 GB DDR3) use, but I hear the fans on my GPU (GeForce GTX 560 Ti, 1 GB) spool up a little almost immediately after the performance begins to slow. The same hardware and NVidia driver worked very well with KDE 4.9.5 with full OpenGL compositing enabled.
Please provide the output of "qdbus org.kde.kwin /KWin supportInformation" (ideally with GL compositing enabled).
Also check whether setting tearing prevention to "none" ("kcmshell4 kwincompositing", 3rd tab) and exporting KWIN_USE_BUFFER_AGE=0 has any impact.
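For reference, a minimal sketch of that second test (assuming a running KDE 4.x session; note that "kwin --replace" kills and replaces the current window manager):

```shell
# Disable buffer-age based repainting for processes started from this
# shell (assumption: kwin reads the variable at startup, so it must be
# restarted for the change to take effect).
export KWIN_USE_BUFFER_AGE=0

# Restart the window manager/compositor with the new environment.
kwin --replace &
```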
> and exporting KWIN_USE_BUFFER_AGE=0 has any
That's not yet in a release, is it? It's for 4.11.5, and that's not yet
released. 4.12.0 doesn't exist in kde-workspace, so at most it's 4.11.4.
Output of "qdbus org.kde.kwin /KWin supportInformation" follows...
KWin version: 4.11.4
KDE SC version (runtime): 4.12.0
KDE SC version (compile): 4.12.0
Qt Version: 4.8.5
Active screen follows mouse: no
Number of Screens: 1
Screen 0 Geometry: 0,0,2560x1440
Current Plugin: kwin3_oxygen
Announces Alpha: yes
Frame Overlap: no
Blur Behind: no
Qt Graphics System: raster
Compositing is active
Compositing Type: OpenGL
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce GTX 560 Ti/PCIe/SSE2
OpenGL version string: 4.4.0 NVIDIA 331.20
OpenGL shading language version string: 4.40 NVIDIA via Cg compiler
Driver version: 331.20
GPU class: GF100
OpenGL version: 4.4
GLSL version: 4.40
X server version: 1.14.3
Linux kernel version: 3.11.6
Direct rendering: yes
Requires strict binding: no
GLSL shaders: yes
Texture NPOT support: yes
Virtual Machine: no
OpenGL 2 Shaders are used
Painting blocks for vertical retrace: no
Currently Active Effects:
(In reply to comment #2)
> That's not yet in a release, isn't it?
No, sorry - I assumed 4.12.0 would equal 4.11.5.
Please try setting the tearing prevention to "none" nevertheless (that's gonna work ;-) and also deactivate the blur effect.
Please make sure to restart kwin ("kwin --replace &") after altering the tearing prevention mode.
Next thing (even if that helps!) would be to check whether triple buffering is enabled:
grep -i triple /var/log/Xorg.0.log
And if not, try the results with:
export __GL_YIELD="USLEEP"
kwin --replace &
(In reply to comment #4)
> (In reply to comment #2)
> > That's not yet in a release, isn't it?
> No, sorry. I assumed 4.12.0 would equal 4.11.5
> Please try setting the tearing prevention to "none" nevertheless (that's
> gonna work ;-) and also to deactivate the blur effect.
> Please ensure to restart "kwin --replace &" after altering the tearing
> prevention mode.
I'm not sure how to say this, but after not working since I installed openSUSE 13.1 over 4 days ago, OpenGL compositing in KDE seems to be working now. Or at least it has not caused any slowdown or lockup since last night, around when I uploaded the qdbus command output - it has also survived 2 cold reboots I did as a test this morning.
Blur is running, tearing prevention is set to 'auto', and performance is as good as it was under KDE 4.9.5, which on my GeForce GTX 560 Ti was very good...
> Next thing (even if that helps!) would be to check whether triple buffering
> is enabled:
> grep -i triple /var/log/Xorg.0.log
No mention of triple buffering in Xorg.0.log.
> And if not try results with:
> export __GL_YIELD="USLEEP"
> kwin --replace &
I'll keep monitoring the performance, and if it slows again I'll look at usleep and other options, and report back to here.
Thanks guys, although I have no idea what actually fixed it - maybe it fixed itself somehow...
Chris W, NZ.
OK, see bug #322060 (it's a long one).
Basically, when triple buffering is not available and usleep yielding is not set, tearing prevention will be disabled.
Since there's no legal way to detect triple buffering, this detection might fail, leading KWin to believe it can swap for free and thus becoming rather sloppy about its frame time calculation, running into long, expensive waits for the swap interval.
Consider enabling triple buffering (bug #322060 comment #56) or exporting the usleep yielding strategy (see e.g. the script attached to bug #322060).
If the issue re-occurs at any time, just re-open the bug.
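For illustration, enabling triple buffering with the NVIDIA blob is an xorg.conf change of roughly this shape (a sketch; the section identifier is an example, and bug #322060 comment #56 has the exact instructions):

```
Section "Device"
    Identifier "nvidia"              # example identifier, match your config
    Driver     "nvidia"
    Option     "TripleBuffer" "True" # let the driver render ahead one frame
EndSection
```

X needs to be restarted for the option to take effect.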
It was nice while it lasted, but then the issue came back a short while ago.
(In reply to comment #6)
> Ok, see this long bug #322060
> Basically, when triple buffering is not available and usleep yielding not
> set, tearing prevention will be disabled.
> Since there's no legal way to detect triple buffering, this might fail,
> leading kwin to believe it could swap for free, thus becoming rather sloppy
> about frame time calculation, running into long expensive waits for the swap
> Consider enabling triple buffering (bug #322060 comment #56) or exporting
> usleep yielding (see eg. script attached to bug #322060)
> If the issue re-occurs any time, just re-open the bug.
I enabled triple buffering in xorg.conf, and after restarting X did "export __GL_YIELD="USLEEP"" - that worked when I restarted kwin. So now I need to figure out where to add the __GL_YIELD command so it is set at boot - any suggestions? ;-)
Chris W, NZ
> > If the issue re-occurs any time, just re-open the bug.
> I enabled triple buffer in xorg.conf, and after restarting X did "export
> __GL_YIELD="USLEEP" - that worked when I restarted kwin, so now I need to
> figure out where to add the __GL_YIELD command so it works on booting, any
> best suggestions? ;-)
I answered my own question - I found a site that recommended adding it to /etc/profile, but the openSUSE version of that config file strongly recommends putting anything of the sort into /etc/profile.local instead, which I've now done. After a reboot everything seems to be running fine.
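For the record, the addition is of this shape (a sketch; /etc/profile.local is the openSUSE-specific hook sourced by /etc/profile, other distributions typically use a file under /etc/profile.d/ instead):

```shell
# Contents for /etc/profile.local (sourced for login shells on openSUSE).
# Make the NVIDIA blob usleep() while waiting on the GPU instead of
# busy-waiting, which kwin's swap timing relies on.
export __GL_YIELD="USLEEP"
```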
I'll keep you posted to see if it sticks...
Chris W, NZ.
The dupe has a script attached that starts kwin and allows you to alter various env settings.
Please note that exporting the yield strategy globally will affect every GL application - games may respond by dropping fps.
Also, adjusting the yield strategy should not be necessary when triple buffering is enabled (as the driver never has to wait for the retrace).
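As a middle ground (a sketch, not taken from the attached script), the variable can be scoped to the kwin process alone, so other GL applications keep the driver's default yielding:

```shell
# Set __GL_YIELD only in kwin's environment; games started elsewhere
# still see the NVIDIA driver's default (busy-wait) yield behaviour.
__GL_YIELD="USLEEP" kwin --replace &
```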
*** This bug has been marked as a duplicate of bug 322060 ***
Bug #329297 seems to have a false POSITIVE detection of triple buffering.
(In reply to comment #9)
> the dupe has a script attached that starts kwin and allows you to alter
> various env settings.
> please notice that exporting the yield strategy globally will impact every
> gl application - games may reply that by dropping fps.
> also yielding adjustment should not be necessary when triple buffering is
> enabled (as the driver never has to wait for the retrace)
I tried running with just triple buffering enabled, but eventually the problem returns, sometimes almost immediately, sometimes after a bit longer (3-5 minutes). So I have re-enabled export __GL_YIELD="USLEEP" for now. Testing some games and other OpenGL apps like the Unigine Valley demo, they seem to run very well, as I expected from before, so if they are losing frames due to the yield strategy it is not bad enough for me to notice, let alone worry about.
I have also noticed that without the yield strategy set, the problem seems more likely to be triggered by doing something KDE-specific, like changing the wallpaper, opening the desktop config, or running other apps that are part of the KDE desktop environment - when running non-KDE apps the problem seems to stay away a bit longer.
> *** This bug has been marked as a duplicate of bug 322060 ***
Looking at that bug report there are similarities, but logging CPU performance shows no excess load on the Phenom II X6 at any time - there is, however, often an increase in fan speed on my GeForce card. So I think this bug would be better resolved as WORKSFORME rather than as a duplicate?
I shall leave things for now, since the yield strategy seems to work well for me.
Chris W, NZ.
The CPU load can happen in the kernel, but the (assumed - we should check that next year) false positive detection of triple buffering, as well as triple buffering alone not resolving the issue (it does show up as active in /var/log/Xorg.0.log, doesn't it?), is indeed suspicious.
It could be that the refresh rate / max fps is misdetected or overridden; we should check for that as well ... next year ;-)
The core problem from our side (the default yielding strategy causing "trouble™" for the nvidia blob) is, however, the same.
(In reply to comment #12)
> The cpu load can happen in the kernel, but the (assumed, we should check
> that next year) false positive detection of triple buffering as well as
> triple buffering not resolving the issue alone (it does show up as active in
> /var/log/Xorg.0.log, does it?) is indeed suspicious.
:~> grep -i triple /var/log/Xorg.0.log
[ 22.835] (**) NVIDIA(0): Option "TripleBuffer" "1"
It's in xorg.conf for sure.
> Could be that the refreshrate/maxfps is misdetected/overridden, we should
> check for that as well ... next year ;-)
> The core problem from our side (default yielding strategy causing "trouble™"
> for the nvidia blob) is however the same.
Between enabling triple buffering and the yield strategy, my KDE desktop is behaving and performing at a level that is more than satisfactory. OpenGL/CUDA apps like Octane Render, Nexuiz, the Valley and Heaven benchmarks, Prey, WebGL stuff, and Blender/Cycles rendering are all working very well. I'm happy to let things sit where they are since they seem stable, but if you want to get to why triple buffering alone doesn't fix the issue, just point me in a direction and I'll do what I can to help.
Happy New Year,
Chris W, NZ.
It would be very good to know whether triple buffering is reliably detected on the system.
To do so, you'd have to run "kdebugdialog --fullmode", filter for kwin (1212), and redirect all kwin output to some file, e.g. /tmp/kwin.dbg.
After a short time (500 screen updates) the file will contain a "Triple buffering detection" line indicating whether triple buffering is assumed to be available, along with the mean blocking time of glSwapBuffers().
If this works as expected, we'd be facing a new change in the nvidia driver (that doesn't change the implication, namely getting it to use the "proper" yielding, but it would nevertheless be good to know what the critical call is).
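A sketch of that procedure as shell commands (assuming debug area 1212 has already been enabled via "kdebugdialog --fullmode"; the exact wording of the detection line may differ):

```shell
# Restart kwin with all of its debug output captured to a file.
kwin --replace > /tmp/kwin.dbg 2>&1 &

# Wait a bit (in practice, until roughly 500 frames have been rendered).
sleep 5

# Look for the triple buffering detection result.
grep -i "triple buffering" /tmp/kwin.dbg || echo "detection line not found yet"
```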
Thanks in advance and
Happy New Year to you as well.