Summary: | color saturation in blurred regions is higher than expected | | |
---|---|---|---|
Product: | [Plasma] kwin | Reporter: | Sawyer Bergeron <sawyerbergeron> |
Component: | compositing | Assignee: | KWin default assignee <kwin-bugs-null> |
Status: | RESOLVED FIXED | | |
Severity: | normal | CC: | david.decos, fabian, fredrik, kailash.sudhakar, michal.dybczak, mvourlakos, nate, sawyerbergeron, subdiff, tsujan2000 |
Priority: | NOR | Flags: | vlad.zahorodnii: Intel+ |
Version: | 5.16.0 | | |
Target Milestone: | --- | | |
Platform: | Arch Linux | | |
OS: | Linux | | |
Latest Commit: | https://commits.kde.org/kwin/5191311d36fbbbe51a3c137f36148a662a099963 | Version Fixed In: | 5.16.3 |
Sentry Crash Report: | | | |
Attachments: | demonstration of blur color saturation on affected system; glxinfo output on wayland session; solid-black-instead-of-full-blur-transparency-100%; Grainy panel after update to 5.16.3 | | |
Seems like an Intel bug. Can you post the output of glxinfo?

Created attachment 120814 [details]
glxinfo output on wayland session
(In reply to Vlad Zagorodniy from comment #1)
> Seems like an Intel bug. Can you post the output of glxinfo?

Gladly, anything else?

The blur effect assumes that the default framebuffer has GL_SRGB color encoding, but on Intel it seems like that's not the case. :/ That explains why the blurred background is a bit darker: the hardware converts colors, which are already in linear color space, from sRGB to linear color space.

There is a way to check that at runtime by calling glGetFramebufferAttachmentParameteriv() with pname set to GL_FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING.

*** Bug 408790 has been marked as a duplicate of this bug. ***

*** Bug 408773 has been marked as a duplicate of this bug. ***

@Fredrik The default framebuffer has GL_LINEAR color encoding.

(In reply to Fredrik Höglund from comment #5)
> There is a way to check that at runtime by calling
> glGetFramebufferAttachmentParameteriv() with pname set to
> GL_FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING.

I can test this if you want:

1. Is there any command that can reveal this?
2. If not [1], do you have a small testing program that I can build to return the output to you?

Indeed it's the Intel thing. When I switched to NVIDIA this dark saturation is gone.

*** Bug 408915 has been marked as a duplicate of this bug. ***

(In reply to Michail Vourlakos from comment #9)
> I can test this if you want:
>
> 1. Is there any command that can reveal this?
> 2. If not [1], do you have a small testing program that I can build to return
> the output to you?

Vlad already tested it per comment #8.
Possible fix: https://phabricator.kde.org/D21908

(In reply to Fredrik Höglund from comment #12)
> Vlad already tested it per comment #8.
>
> Possible fix: https://phabricator.kde.org/D21908

The issue is only under X11?

Here, the patch fixed the issue. Thanks a lot! I haven't tested KWin 5.16.1 under Wayland though.

OK, tested and saw the bug was present under Wayland too -- the patch had no effect there.

*** Bug 408949 has been marked as a duplicate of this bug. ***

Created attachment 121192 [details]
solid-black-instead-of-full-blur-transparency-100%
Looks like some wallpapers can even change the 100% blur into 100% solid black (no blur), which shows the extent to which this bug can influence the look. The attachment is a screenshot of such a solid black effect on latte-dock. Changing the wallpaper introduced a partially transparent look, so it does depend on the background somehow.

From my experience, white or light background colors reduce this dark effect, while dark ones strengthen it. It's really annoying because it robs me of control over how my system looks :(.
> It's really annoying because....
It's already fixed under X11. You could apply the patch or wait for it to come to your distro.
Ah, didn't know. Thanks for the info. I'll wait for the fix then. It is scheduled for the 5.16 series, I hope?

> It is scheduled for the 5.16 series, I hope?
That's my question too -- I'm just a kwin user. I only know that the patch isn't applied to kwin 5.16.2.
Git commit 3d384f3c90205f35fea445446903661c7c046514 by Fredrik Höglund.
Committed on 29/06/2019 at 11:09.
Pushed by fredrik into branch 'Plasma/5.16'.

glx: Prefer an sRGB capable fbconfig

Prefer an sRGB capable fbconfig for the default framebuffer.

Signed-off-by: Fredrik Höglund <fredrik@kde.org>

M  +26   -5    plugins/platforms/x11/standalone/glxbackend.cpp

https://commits.kde.org/kwin/3d384f3c90205f35fea445446903661c7c046514

The patch led to a regression on cirrus (the default video device of openQA) and possibly needs to be reverted. Since it was missing in the commit message, for reference, the review of the patch was: https://phabricator.kde.org/D21908

Interesting parts of the debug output:

```
qt.qpa.screen: adding QXcbScreen(0x5630281c1250, name="VGA-1", geometry=1024x768+0+0,
    availableGeometry=1024x732+0+0, devicePixelRatio=1.0, logicalDpi=QPair(96.0,96.0),
    physicalSize=270.0x203.0mm, screenNumber=0, virtualSize=1024x768 (1024.0x768.0mm),
    orientation=Qt::LandscapeOrientation, depth=16, refreshRate=60.0, root=2e4,
    windowManagerName="") (Primary: true )
qt.qpa.gl: Multithreaded OpenGL disabled: blacklisted vendor "Mesa Project and SGI"
qt.qpa.gl: Force-enable multithreaded OpenGL by setting environment variable QT_OPENGL_NO_SANITY_CHECK
kwin_platform_x11_standalone: Choosing GLXFBConfig 0x27e X visual 0x2d5 depth 32 RGBA 8:8:8:8 ZS 0:0 sRGB: 1
kwin_platform_x11_standalone: Created GLX context with attributes:
    Version requested: true
    Version: 2.1
    Robust: false
    Forward compatible: false
    Core profile: false
    Compatibility profile: false
    High priority: false
OpenGL vendor string: VMware, Inc.
OpenGL renderer string: llvmpipe (LLVM 8.0, 256 bits)
OpenGL version string: 3.1 Mesa 19.1.0
OpenGL shading language version string: 1.40
Driver: LLVMpipe
GPU class: Unknown
OpenGL version: 3.1
GLSL version: 1.40
Mesa version: 19.1
X server version: 1.20.5
Linux kernel version: 5.1.10
Requires strict binding: yes
GLSL shaders: yes
Texture NPOT support: yes
Virtual Machine: no
kwin_platform_x11_standalone: Direct rendering: true
kwin_scene_opengl: OpenGL 2 compositing successfully initialized
kwin_core: Instantiated compositing plugin: "SceneOpenGL"
kwin_platform_x11_standalone: Using FBConfig 0x26e for visual 0x2c6
```

(In reply to Fabian Vogt from comment #23)
> kwin_platform_x11_standalone: Choosing GLXFBConfig 0x27e X visual 0x2d5
> depth 32 RGBA 8:8:8:8 ZS 0:0 sRGB: 1

```
    visual  x   bf lv rg d st  colorbuffer  sr ax dp st accumbuffer  ms  cav
  id dep cl sp  sz l  ci b ro  r  g  b  a F gb bf th cl r  g  b  a ns b eat
----------------------------------------------------------------------------
0x27e 32 tc  0  32  0 r  y .   8  8  8  8 . s  0  0  0  0  0  0  0  0 0 None
0x2d5 32 tc  0  32  0 r  y .   8  8  8  8 . s  0  0  0  0  0  0  0  0 0 None
```

> kwin_platform_x11_standalone: Using FBConfig 0x26e for visual 0x2c6

```
    visual  x   bf lv rg d st  colorbuffer  sr ax dp st accumbuffer  ms  cav
  id dep cl sp  sz l  ci b ro  r  g  b  a F gb bf th cl r  g  b  a ns b eat
----------------------------------------------------------------------------
0x26e 32 tc  0  32  0 r  . .   8  8  8  8 . .  0  0  0  0  0  0  0  0 0 None
0x2c6 32 tc  0  32  0 r  . .   8  8  8  8 . .  0  0  0  0  0  0  0  0 0 None
```

(In reply to Fabian Vogt from comment #23)
> qt.qpa.screen: adding QXcbScreen(0x5630281c1250, name="VGA-1",
> geometry=1024x768+0+0, availableGeometry=1024x732+0+0, devicePixelRatio=1.0,
> logicalDpi=QPair(96.0,96.0), physicalSize=270.0x203.0mm, screenNumber=0,
> virtualSize=1024x768 (1024.0x768.0mm), orientation=Qt::LandscapeOrientation,
> depth=16, refreshRate=60.0, root=2e4, windowManagerName="") (Primary: true )

This ^^^^^ is a quirk of cirrus: 16 bit depth (RGB565) is preferred.
(In reply to Fabian Vogt from comment #24)
> This ^^^^^ is a quirk of cirrus: 16 bit depth (RGB565) is preferred.

With the commit reverted it works again, and the debug output shows that indeed a 16 bit visual got chosen:

```
kwin_platform_x11_standalone: Choosing GLXFBConfig 0x144 X visual 0x28e depth 16 RGBA 5:6:5:0 ZS 0:0
kwin_platform_x11_standalone: Using FBConfig 0x142 for visual 0x28c
kwin_platform_x11_standalone: Using FBConfig 0x26e for visual 0x2c6
```

(In reply to Roman Gilg from comment #22)
> The patch led to a regression on cirrus (the default video device of openQA)
> and possibly needs to be reverted.

Reverting this patch means having to disable gamma correction on Intel. Punishing every user with an Intel GPU in order to keep kwin working with a software renderer on a Cirrus chip that claims to support depth 24/32 even though it doesn't is simply not acceptable, IMO. So I'm going to solve this by blacklisting sRGB configs on LLVMpipe instead. But I would be just as happy to say that we simply do not support the kind of system you are describing.

(In reply to Fredrik Höglund from comment #26)
> Reverting this patch means having to disable gamma correction on Intel.

It seems to be a kernel default to use 16bpp. With "cirrus.bpp=24" as a kernel option, kwin_x11 works fine.

> So I'm going to solve this by blacklisting sRGB configs on LLVMpipe instead.

That sounds like a bit too much; everything except cirrus with 16bpp seems to work.

> But I would be just as happy to say that we simply do not support the kind
> of system you are describing.

For master I'd be fine with that as well. Cirrus is a weird device/driver combo to work with, and there's work ongoing to switch to -vga std in openQA by default (at least three years too late, if you ask me).

Since 5.16.3 is tomorrow, I'm guessing https://phabricator.kde.org/D22153 isn't going to quite make it in?

Git commit 5191311d36fbbbe51a3c137f36148a662a099963 by Fredrik Höglund.
Committed on 08/07/2019 at 22:43.
Pushed by fredrik into branch 'Plasma/5.16'.

[effects/blur] Disable sRGB when the framebuffer is linear

Disable sRGB rendering when the color encoding of the default framebuffer is linear.
FIXED-IN: 5.16.3
Differential Revision: https://phabricator.kde.org/D22153

Signed-off-by: Fredrik Höglund <fredrik@kde.org>

M  +28   -5    effects/blur/blur.cpp

https://commits.kde.org/kwin/5191311d36fbbbe51a3c137f36148a662a099963

(In reply to Fabian Vogt from comment #27)
> > So I'm going to solve this by blacklisting sRGB configs on LLVMpipe instead.
>
> That sounds like a bit too much; everything except cirrus with 16bpp seems to
> work.

Unfortunately we can't easily detect that the video device is a Cirrus device. The OpenGL driver can only tell us that it is llvmpipe; it doesn't know where the results of the rendering are going to be presented.

(In reply to Fredrik Höglund from comment #30)
> Unfortunately we can't easily detect that the video device is a Cirrus
> device. The OpenGL driver can only tell us that it is llvmpipe; it doesn't
> know where the results of the rendering are going to be presented.

Luckily that shouldn't be necessary, as we only know that llvmpipe + 16bpp is broken. Is detecting a 16bpp default framebuffer possible?

(In reply to Fabian Vogt from comment #31)
> Luckily that shouldn't be necessary, as we only know that llvmpipe + 16bpp
> is broken. Is detecting a 16bpp default framebuffer possible?

That's something that we can do. I'll update https://phabricator.kde.org/D22203

Created attachment 121442 [details]
Grainy panel after update to 5.16.3
I just updated to 5.16.3, and my panel has become kind of grainy. The grain wasn't there before, so I went through the changelog and found this bug report. Since my card is an Intel one, I thought it might be related.
In case it helps, I'm using Breeze Transparent Dark, but I switched to other transparent themes and the "grains" are there too, even in the non-dark ones.
I just updated to Plasma 5.16.3 on Manjaro Testing and I can confirm that the issue is gone. Blurred, transparent areas look colorless again.

David de Cos, if what you are describing is about the blur effect, go to the blur settings in Desktop Effects and adjust the blur strength. Some settings can create a grainy result on purpose, while others will look very, very blurry, as intended.

Thanks, that worked. Although lowering the noise strength to the minimum is what did the trick, not the blur strength. I remember this setting did absolutely nothing before updating to 5.16.3, so I had it at a random value. Now I've lowered it to the minimum, since all that increasing it does (as far as my eyes can see, at least) is create more, and whiter, grains.

Git commit 4982dfd5f5ec408a19de48a1ada98f91497db48d by Fabian Vogt.
Committed on 15/10/2019 at 14:07.
Pushed by fvogt into branch 'Plasma/5.17'.

glx: Don't use sRGB configs on llvmpipe with depth 16

Summary:
This is necessary to keep openQA working, which uses LLVMpipe as a renderer on a Cirrus device that operates in depth 16. LLVMpipe advertises 24/32 bit sRGB configurations on this setup, but they cannot be presented.

Test Plan: Compile tested only.

Reviewers: fvogt, #kwin, zzag
Reviewed By: fvogt, #kwin, zzag
Subscribers: romangg, sbergeron, fvogt, kwin
Tags: #kwin
Differential Revision: https://phabricator.kde.org/D22203

M  +16   -1    plugins/platforms/x11/standalone/glxbackend.cpp

https://commits.kde.org/kwin/4982dfd5f5ec408a19de48a1ada98f91497db48d
Created attachment 120810 [details]
demonstration of blur color saturation on affected system

SUMMARY

STEPS TO REPRODUCE
1. Make some blurred + transparent region visible

OBSERVED RESULT
The region behind a blurred + transparent region is significantly more saturated than the actual region, making dark areas look almost black and lighter areas completely white. This is compounded when blurred regions are stacked.

EXPECTED RESULT
Saturation appears similar to the background region on average.

SOFTWARE/OS VERSIONS
Linux/KDE Plasma: kernel 5.1.8 (available in About System)
KDE Plasma Version: Plasma 5.16.0 (present in 5.15.9/5.16 beta)
KDE Frameworks Version: 5.59.0
Qt Version: Qt 5.12.3

ADDITIONAL INFORMATION
This is on a machine with an i7 7700HQ with integrated Intel graphics; it is not reproducible with identical binaries on a machine with a Nehalem-era Xeon and Polaris AMD graphics (amdgpu driver). This occurs on fresh KDE configs with a new user, as well as with older configs.