Version: unspecified (using KDE 4.5.95)
OS: Linux

This happens even with blur and the Lanczos filter disabled, with a simple theme used instead of the Oxygen theme, and with lxpanel used instead of Plasma, though in that setup it takes longer for performance to degrade.

This bug is also reported in the following forum topics:
http://forum.kde.org/viewtopic.php?f=111&t=76983
http://forum.kde.org/viewtopic.php?f=111&t=91659
http://forum.kde.org/viewtopic.php?f=111&t=85676
It is also mentioned in:
http://www.linuxtoday.com/news_story.php3?ltsn=2010-10-21-008-35-RV-DT-KE-0004
and in countless other places.

Reproducible: Always

Steps to Reproduce:
1. Open about 7 windows
2. Go to the Desktop Effects settings
3. Switch from the "Texture from Pixmap" method to "Shared Memory" and then back (this seems to reset some internal buffers, so it restores full performance)
4. Resize a window, then minimize and maximize it
5. Work for some time, but don't close the window
6. Resize the same window again, then minimize and maximize it

Actual Results: Right after the settings change, the window resizes fairly smoothly and the effects are perfectly fast, but after some work it resizes more choppily and the effects become choppy too. However, if you trigger the same effect several times in a row (e.g. minimize and maximize a window), performance becomes smooth again for a short time.

Expected Results: Effect performance shouldn't degrade.

My graphics card is an NVIDIA GeForce GTS 250; I have no xorg.conf.
I've experienced the same.
There's probably a leak (/probably/ in the driver) which seems to affect only NVIDIA GPUs capable of CUDA/VDPAU (>= 8xxx). Check "xrestop" on a fresh session and again once things start to become slow... Also check whether the nouveau driver exposes the same issue.
KWin's numbers in xrestop do not change much when it becomes slow, and they stay the same when I "fix" the regression by minimizing/unminimizing a window or by restarting KWin (after a restart it quickly reaches the same numbers, even before it starts to become slow).
Wow, wait - simply minimizing/unminimizing (i.e. hiding/showing) a random window fixes this?
- Do you have any minimization effect active (like "Minimize" or "Magic Lamp")?
- Can you effectively enter unredirected fullscreen mode (set Konqueror fullscreen, right-click the background, a popup shows up and the screen flickers...)?
1. For some time, and then it degrades again. (Yes, I already wrote this in the description: repeating ANY effect over and over fixes it for a while.)
2. Yes, it flickers.
What if you keep running, say, glxgears? (Or a little cooler: the glmatrix screensaver hack, or the snow effect, or whatever causes some GPU load.)
Still degrades... but slower. It's hard to estimate it, since KWin developers did a great job optimizing it, so it's now hard to find out if it lags...
BTW, the slowdown doesn't affect effects already in progress; I have never seen snow, cover switch or wobbly windows being choppy. Snowflakes only start to lag while the minimize effect is in progress; after the minimize is done, they become smooth again.
And BTW, I noticed that when KWin is slow, the Cover Switch effect runs smoothly, but its ending (when the windows fly back to their old places) is choppy.
And doing this (Alt+Tab, Alt+Tab, to trigger ending sequence multiple times) "fixes" the slowdown for some time, just like minimize/unminimize.
I also noticed that sometimes one effect may be smooth while another is laggy (e.g. minimize is smooth, slide is laggy).
What kind of decoration do you use, and does it help to choose non-animated Oxygen, QtCurve or Bespin with NO window borders but only the titlebar (and for Oxygen and QtCurve: no integrated shadows either)?
It happens here as well on every version of KDE from 4.5.1 to 4.6.1. I have an NVIDIA NVS 3100M graphics card. It's the most annoying bug I've encountered so far. It goes away once I disable and then re-enable any effect. It happens about once a day on average.
I think that's not a bug - it's a powersaving feature of the graphics card. I've set the PowerMizer mode to "Prefer Maximum Performance" in NVIDIA Settings, and now KWin is always fast, even with blur, and doesn't degrade.
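For anyone who prefers the command line over the NVIDIA Settings GUI, the same change can be sketched with nvidia-settings. This is an assumption-laden sketch: the attribute names (GPUPowerMizerMode, GPUCurrentPerfLevel) are the ones exposed by the nvidia-settings query interface on recent proprietary drivers, and the commands only make sense on a machine actually running that driver, so the script guards for that:

```shell
#!/bin/sh
# Sketch (assumed attribute names from the nvidia-settings query interface):
# set PowerMizer to "Prefer Maximum Performance" from the command line.
# GPUPowerMizerMode values: 0 = Adaptive, 1 = Prefer Maximum Performance.
set_mode='nvidia-settings -a [gpu:0]/GPUPowerMizerMode=1'
query_level='nvidia-settings -q [gpu:0]/GPUCurrentPerfLevel'

# Only run the commands when the tool is present (it needs the
# proprietary driver and a running X server); otherwise just show them.
if command -v nvidia-settings >/dev/null 2>&1; then
    $set_mode
    $query_level
else
    echo "nvidia-settings not available; would run:"
    echo "  $set_mode"
    echo "  $query_level"
fi
```

Note this only persists for the current X session; making it stick across reboots needs the xorg.conf route from the articles linked below in this thread.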
Yeah, it's because of powersaving. On performance level 0 the GPU runs at a 300 MHz graphics clock, a 100 MHz memory clock and a 600 MHz processor clock. Is it because KWin is inefficient, or is it simply not possible to have full-screen effects at such low frequencies?
And if it's not possible to have effects on performance level 0, is it possible for KWin to force the GPU to go to performance level 1?
AFAIR, PowerLevel 0 is MaxPerformance. The memory clock is, however, very likely too low (it depends on the bandwidth and the resolution).

The only generic way to keep the GPU up would be to put some constant load onto it (which is rather not the idea of "powersaving"), since we can hardly predict when a user interaction will lead to high GPU demands. Also, resizing is mostly an issue on the client - the WM is (except for maybe some decorations like Aurorae) rather cheap in this regard.

Here are some links for you to cross-read - you can force the performance level (based on the power source) in xorg.conf. (It's always the same stuff; as soon as you've understood one article, you can drop the others.)
https://wiki.archlinux.org/index.php/NVIDIA#Forcing_Powermizer_performance_level_.28for_laptops.29
http://linux.aldeby.org/nvidia-powermizer-powersaving.html
http://www.nvnews.net/vbulletin/showthread.php?t=110949
http://owened.net/2008/04/23/how-to-force-nvidia-powermizer-performance-in-linux
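For the archives, the approach those articles describe boils down to an xorg.conf Device-section snippet roughly like the one below. Treat it as a sketch: the RegistryDwords keys and values are undocumented driver internals that vary between driver versions (the exact bits are spelled out in the linked pages), so verify them against the article matching your driver before using it:

```
Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    # Assumed values per the linked Arch wiki article:
    # PerfLevelSrc=0x2222 pins the performance level on both AC and battery;
    # PowerMizerDefaultAC selects which level gets pinned.
    Option "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x2222; PowerMizerDefaultAC=0x1"
EndSection
```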
Compiz is able to perform well at the lowest performance level (minimize/unminimize is smooth, etc.), so it seems that KWin is not as efficient as Compiz.
Hard-setting the GPU/VRAM frequencies to 200/100 MHz here, Compiz (tested Present Windows) stutters just as much as KWin (from an optical profiling ;-)

Therefore yours is a pretty pointless statement (sorry - it sounds like a Phoronix "benchmark" ;-) because it includes no information about the used and affected effects, what kind they are, whether they're comparable, or whether both ran in the same environment otherwise. (Blurring, e.g., is VRAM-intense.)

However, there /is/ a major difference from Compiz: the Compiz decorators run directly in the GL context, whereas KWin (still...) uses a quite nasty indirection to render the decoration (because it was not written as, and is not limited to being, an OpenGL compositing manager), and the NVIDIA driver mightily sucks at exactly this point, especially with the Oxygen or Aurorae decorations, and cannot have enough VRAM speed to at least halfway cover this issue :-( (It shows up not only, but especially, whenever you change the active window.)

So if you want to provide qualified input on the performance of Compiz & KWin, you'll have to create and document a defined environment - i.e. eliminate as many variables as possible:
a) hard-set the GPU performance level (i.e. do NOT rely on the -secret- governor, or worse, the nvidia-settings GUI indicator)
b) list all effect plugins on both systems as well as their settings
c) list the settings on both systems (vsync, in/direct rendering, scale method ("accurate" is heavy on shaders))
d) name the used decorators/settings (in the case of KWin: the "decoration" ;-)
f) select and name the one effect where the difference shows up most
g) whatever I forgot else.
-> Then also attempt to eliminate as many variables as you can and watch the result (shutting down the decorator and disabling decorations for all windows, turning off all apparently unrelated effects, aligning settings as closely as you can, ...)

This amount of information, structure and work may sound nasty, but it ensures (iff followed) reproducible results and lets you bisect the issue - i.e. find the determining variable.
And that's the one and only difference between science and voodoo =)