Bug 402583 - GPU not used on Blur effect
Summary: GPU not used on Blur effect
Status: RESOLVED NOT A BUG
Alias: None
Product: kwin
Classification: Plasma
Component: effects-various
Version: 5.14.3
Platform: Gentoo Packages Linux
Priority: NOR  Severity: normal
Target Milestone: ---
Assignee: KWin default assignee
URL:
Keywords:
Depends on:
Blocks:
 
Reported: 2018-12-26 14:54 UTC by Vlad
Modified: 2023-01-18 12:29 UTC (History)
1 user

See Also:
Latest Commit:
Version Fixed In:
Sentry Crash Report:


Attachments
full glxinfo output (18.58 KB, text/plain)
2018-12-26 14:54 UTC, Vlad
output of qdbus org.kde.KWin /KWin supportInformation (5.54 KB, text/plain)
2018-12-26 16:22 UTC, Vlad

Description Vlad 2018-12-26 14:54:36 UTC
Created attachment 117120 [details]
full glxinfo output

I'm a big fan of KDE's blur. However, on my ThinkPad T500 it performs at under 10 fps with a fullscreen window or multiple smaller ones.
Turning on blur pins one CPU core at essentially 100%.
The laptop uses the G45 chipset and the GMA 4500 integrated GPU.
Its maximum OpenGL capability is 2.1.

According to a Reddit post by a KDE dev(?), the blur requires OpenGL 3.0 [ https://www.reddit.com/r/kde/comments/7szqqk/i_implemented_a_new_blur_effect_in_kde_it/dt8vqmh ]

Apparently it doesn't use the GPU, but the CPU instead(?).
Either way, the desktop becomes unusable. Since G45 laptops are quite popular as Librebooted laptops (T400, X200, T500), it is sad that the blur effect doesn't work in a usable state on them, because it looks awesome.
On Deepin, by the way, which I used for quite some time, blur runs flawlessly.
Can this somehow be fixed? Or is it something on my end?

STEPS TO REPRODUCE
1. Use a GMA 4500 IGP.
2. Turn on blur.
3. Notice the fps drop below 10 while CPU usage skyrockets.

Linux/KDE Plasma: 
(available in About System)
KDE Plasma Version: 5.14.3
KDE Frameworks Version: 5.52.0
Qt Version: 5.11.1

ADDITIONAL INFORMATION
GLXinfo brief output, full output in attachments:

name of display: :0
display: :0  screen: 0
direct rendering: Yes
Extended renderer info (GLX_MESA_query_renderer):
    Vendor: Intel Open Source Technology Center (0x8086)
    Device: Mesa DRI Mobile Intel® GM45 Express Chipset  (0x2a42)
    Version: 18.2.7
    Accelerated: yes
    Video memory: 1536MB
    Unified memory: yes
    Preferred profile: compat (0x2)
    Max core profile version: 0.0
    Max compat profile version: 2.1
    Max GLES1 profile version: 1.1
    Max GLES[23] profile version: 2.0
OpenGL vendor string: Intel Open Source Technology Center
OpenGL renderer string: Mesa DRI Mobile Intel® GM45 Express Chipset 
OpenGL version string: 2.1 Mesa 18.2.7
OpenGL shading language version string: 1.20

OpenGL ES profile version string: OpenGL ES 2.0 Mesa 18.2.7
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 1.0.16
Comment 1 Vlad Zahorodnii 2018-12-26 15:39:35 UTC
Please post output of
    qdbus org.kde.KWin /KWin supportInformation
Comment 2 Vlad 2018-12-26 16:22:32 UTC
Created attachment 117122 [details]
output of qdbus org.kde.KWin /KWin supportInformation

Wow, super quick answer. Thanks :]
Output in attachments.
Comment 3 Martin Flöser 2018-12-26 17:31:49 UTC
Most likely your GPU is not up to the task, but I think there are some things you can improve in your setup: first of all, change the window decoration. Use the default Breeze. Blur behind windows is probably too expensive for your GPU.

There's probably nothing we can do about the increased CPU usage. Some drivers emulate functionality on the CPU without telling the software. It's damn stupid, but it lets them claim a higher OpenGL version so that more games "work".
Comment 4 Vlad 2018-12-26 20:36:59 UTC
(In reply to Martin Flöser from comment #3)
> Most likely your gpu is not up to the task, but I think there are some
> things you can improve in your setup: first of all change the window
> decoration. Use default breeze. Blur behind windows is probably too
> expensive for your gpu.
> 
> That it increases CPU usage is probably nothing we can do about. Some
> drivers emulate functionality on the CPU without telling the software. It's
> damn stupid but gives them a check for a higher OpenGL Version and more
> games "work".

OK, so nothing can be done about it. I'm pretty sure OpenGL 2.1 is the correct maximum for this GPU; nothing is being emulated, except when it switches to full software rendering.
Is there something fundamentally different about KDE's blur compared to Deepin's?

Deepin has full blur behind windows as well, and there my GPU never failed to deliver 60 fps with multiple windows or fullscreen windows...
Comment 5 Martin Flöser 2018-12-27 08:08:45 UTC
(In reply to Vlad from comment #4)
> (In reply to Martin Flöser from comment #3)
> > Most likely your gpu is not up to the task, but I think there are some
> > things you can improve in your setup: first of all change the window
> > decoration. Use default breeze. Blur behind windows is probably too
> > expensive for your gpu.
> > 
> > That it increases CPU usage is probably nothing we can do about. Some
> > drivers emulate functionality on the CPU without telling the software. It's
> > damn stupid but gives them a check for a higher OpenGL Version and more
> > games "work".
> 
> Ok, so nothing can be done about it. I'm pretty sure OpenGL2.1 is the
> correct max for this gpu. nothing is being emulated, except when it switches
> to full software rendering.

Nah, Intel in particular used to emulate some extensions on the CPU. It's like: the hardware can do all of OpenGL 2.1 except for one or two extensions, so the driver emulates those to claim the 2.1 tag. Then games requiring OpenGL 2.1 (without checking extensions individually) start to work. We check for extensions rather than the version, so we end up using the unsupported extensions. I think this behavior of the Intel driver is damn stupid, but I can understand why they do it.

Our own source code documentation says that your driver supports either GL 2 or 2.1 on the hardware side. With a quick Google search I couldn't figure out what's really supported.

> Is there something fundamentally different with KDE's blur compared to
> deepin?
> 
> Deepin has full blur behind windows aswell and there my GPU never failed to
> deliver 60fps on multiple windows or fullscreen windows...

There's blur and then there's blur. I don't know what algorithm Deepin uses, but our old blur effect used a fake blur. Depending on the level of transparency, one can do a very fast and cheap, but bad-looking, blur.
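A toy sketch of what such transparency-based "fake blur" amounts to, assuming it boils down to blending the window colour with one flat average of the backdrop (an illustration of the idea, not the old effect's actual code):

```python
def fake_blur_pixel(window_rgb, backdrop_avg_rgb, alpha):
    """Blend the window colour with a single precomputed average of the
    backdrop. No per-pixel sampling of what's behind the window happens,
    which is why this is cheap but looks much worse than a real blur."""
    return tuple(alpha * w + (1.0 - alpha) * b
                 for w, b in zip(window_rgb, backdrop_avg_rgb))

# A mostly opaque dark window over an orange-ish backdrop average:
print(fake_blur_pixel((0, 0, 0), (200, 100, 50), 0.8))
# approximately (40.0, 20.0, 10.0)
```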
Comment 6 Martin Flöser 2018-12-27 08:11:20 UTC
According to German Wikipedia it only does GL 2.0: https://de.wikipedia.org/wiki/Intel-4-Serie
Comment 7 Vlad Zahorodnii 2018-12-27 21:54:17 UTC
How did you measure FPS on KDE Plasma and Deepin?
Comment 8 Vlad 2018-12-28 10:04:14 UTC
(In reply to Vlad Zagorodniy from comment #7)
> How did you measure FPS on KDE Plasma and Deepin?

The below-10-fps figure is a guess. I activated the Show FPS effect as well, but it doesn't show numbers. On Deepin it was meeting the display's refresh rate: when dragging windows (for instance a fully blurred terminal), it was perfectly smooth with no dropped frames. I didn't measure there either.

(In reply to Martin Flöser from comment #6)
> According to German Wikipedia it only does GL 2.0:
> https://de.wikipedia.org/wiki/Intel-4-Serie

Super interesting! I didn't know about the emulation stuff. Can it somehow be checked which extension is the offender?

There was talk about texture_barrier here [ https://phabricator.kde.org/D9848 ], which apparently is supported since OpenGL 3.1; maybe that's a candidate?
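One low-tech way to narrow it down would be to diff a list of required extension names against what glxinfo advertises; a minimal sketch (the extension names below are illustrative examples, not a confirmed list of what KWin requires):

```python
# Sketch: check whether the driver advertises specific OpenGL extensions.
# In practice you would feed it the real output of `glxinfo`, e.g.:
#   glxinfo_text = subprocess.check_output(["glxinfo"]).decode()

def advertised_extensions(glxinfo_text):
    """Collect GL_* extension tokens from glxinfo output."""
    return {tok.rstrip(",") for tok in glxinfo_text.split()
            if tok.startswith("GL_")}

def missing(required, glxinfo_text):
    """Return the required extensions that glxinfo does not advertise."""
    return sorted(set(required) - advertised_extensions(glxinfo_text))

# Example with a fake glxinfo snippet:
sample = """
OpenGL extensions:
    GL_ARB_framebuffer_object, GL_ARB_vertex_shader, GL_EXT_texture_sRGB,
"""
print(missing(["GL_ARB_texture_barrier", "GL_ARB_framebuffer_object"], sample))
# → ['GL_ARB_texture_barrier']
```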
Comment 9 Vlad Zahorodnii 2018-12-28 12:49:55 UTC
According to https://github.com/linuxdeepin/deepin-wm/blob/master/src/Deepin/DeepinBlurEffect.vala, it looks like Deepin uses a 2-pass blur algorithm.

On the other hand, the blur effect in KWin uses an algorithm described in https://community.arm.com/cfs-file/__key/communityserver-blogs-components-weblogfiles/00-00-00-20-66/siggraph2015_2D00_mmg_2D00_marius_2D00_notes.pdf
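Roughly, that algorithm works on progressively downsampled copies of the scene, which keeps the number of texture samples low on GPUs that can afford the render passes. A toy grayscale sketch of the downsample/upsample idea (not KWin's actual shader, which samples with bilinear offsets in GLSL):

```python
# Toy illustration of the downsample/upsample idea behind this style of
# blur (grayscale, nearest-neighbour upsample; a real implementation runs
# as fragment-shader passes on the GPU).

def downsample(img):
    """Average each 2x2 block into one pixel (image dims assumed even)."""
    h, w = len(img), len(img[0])
    return [[(img[y][x] + img[y][x+1] + img[y+1][x] + img[y+1][x+1]) / 4.0
             for x in range(0, w, 2)] for y in range(0, h, 2)]

def upsample(img):
    """Duplicate each pixel into a 2x2 block."""
    out = []
    for row in img:
        wide = [p for p in row for _ in (0, 1)]
        out.append(wide)
        out.append(list(wide))
    return out

def blur(img, passes=1):
    """Downsample `passes` times, then upsample back: a crude cheap blur."""
    small = img
    for _ in range(passes):
        small = downsample(small)
    for _ in range(passes):
        small = upsample(small)
    return small

img = [[0, 0, 255, 255],
       [0, 0, 255, 255],
       [255, 255, 0, 0],
       [255, 255, 0, 0]]
print(blur(img, passes=2))  # every pixel becomes the global average, 127.5
```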

Regarding the Show FPS effect: it can sometimes force blurring on each frame.

As Martin said, most likely your GPU is not up to the task.
Comment 10 Vlad Zahorodnii 2023-01-18 12:29:11 UTC
Closing due to inactivity and the comments above.