Bug 479168

Summary: De-saturated Colors in HDR Mode
Product: [Plasma] kwin
Component: wayland-generic
Status: RESOLVED UPSTREAM
Severity: minor
Priority: NOR
Version: 5.91.0
Target Milestone: ---
Platform: Fedora RPMs
OS: Linux
Reporter: Simon Berz <acc-kde>
Assignee: KWin default assignee <kwin-bugs-null>
CC: xaver.hugl
Keywords: qt6
Latest Commit:
Version Fixed In:
Sentry Crash Report:
Attachments:
  Measurement reports, ICC profile, Display EDID, etc.
  drm_info, kscreen-doctor, edid, and ddcutil outputs for SDR & HDR modes on HDMI & DP

Description Simon Berz 2023-12-29 17:07:09 UTC
Created attachment 164545 [details]
Measurement reports, ICC profile, Display EDID, etc

SUMMARY
I have tested the color management and HDR features in Plasma. In SDR mode I get really accurate colors after profiling (thank you for implementing this).
In HDR mode the mapping seems to be broken. As far as I understand, SDR content should be displayed as sRGB by default, with the option to "stretch" the colors to rec.2020 via the "SDR Color Intensity" setting.
Leaving the SDR Color Intensity setting at 0% (the default) results in completely de-saturated colors, while setting it to 100% (rec.2020) results in fairly accurate sRGB colors for SDR content.
Playing back an HDR YouTube video (e.g. ) with mpv shows the same de-saturated colors (the HDR part of it works great, though). Playing the same video on the same monitor under macOS with Safari works fine with default display settings, so I'd guess the issue is not (entirely) a monitor one.
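To make the expectation concrete, here is a minimal sketch (my own illustration, not KWin's actual implementation) of what a 0%–100% "intensity" blend between the sRGB and rec.2020 primaries could look like, using the standard CIE xy chromaticities:

```python
# Standard CIE xy chromaticities for the red, green, and blue primaries.
SRGB = {"r": (0.640, 0.330), "g": (0.300, 0.600), "b": (0.150, 0.060)}
BT2020 = {"r": (0.708, 0.292), "g": (0.170, 0.797), "b": (0.131, 0.046)}

def blended_primaries(t):
    """Linearly interpolate chromaticities: t=0 -> sRGB, t=1 -> rec.2020.

    This is only a model of what the "SDR Color Intensity" slider seems
    to describe; the real compositor pipeline may blend differently.
    """
    return {
        c: tuple((1 - t) * s + t * w for s, w in zip(SRGB[c], BT2020[c]))
        for c in SRGB
    }
```

Under this reading, 0% should reproduce plain sRGB, so accurate colors only at 100% and washed-out colors at 0% would suggest the signal is being interpreted in the wrong color space somewhere downstream.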

I did a bunch of DisplayCAL measurements in SDR and HDR mode. See attachment for measurement reports, ICC profile, EDID, etc.

STEPS TO REPRODUCE
1. Enable HDR mode using the default SDR Color Intensity setting of 0%
2. Look at any SDR content
3. Vary the SDR Color Intensity setting in Display settings and compare the results (ideally measure them)

OBSERVED RESULT
Keeping the SDR Color Intensity setting at 0% results in really de-saturated colors. A value of 90% to 100% is required to get fairly accurate sRGB colors.

EXPECTED RESULT
The default SDR Color Intensity setting results in fairly accurate colors (as accurate as the display can reproduce out of the box). Increasing the value results in more saturated colors.

SOFTWARE/OS VERSIONS
Operating System: Fedora Linux 40
KDE Plasma Version: 5.91.0
KDE Frameworks Version: 5.247.0
Qt Version: 6.6.1
Kernel Version: 6.6.8-200.fc39.x86_64 (64-bit)
Graphics Platform: Wayland
Processors: 24 × AMD Ryzen 9 5900X 12-Core Processor
Memory: 62.7 GiB of RAM
Graphics Processor: AMD Radeon RX 7900 XTX
Display: Dell Alienware AW3423DWF
Comment 1 Zamundaaa 2023-12-29 23:40:07 UTC
With rec.2020 content (like mpv with the Vulkan layer), the gamut mapping matrix is an identity matrix, so gamut mapping shouldn't be an issue.

Does your monitor have something where it shows information about the inputs? What you're describing sounds a lot like it thinks it's getting rec.709 content
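For context on the identity-matrix point (my own illustration, not code from KWin or this thread): an RGB-to-RGB gamut-mapping matrix can be built from the CIE xy primaries and white point via XYZ, and when source and target are both rec.2020 it collapses to the identity, so this step cannot change saturation:

```python
# Sketch: RGB -> XYZ matrix from xy primaries + white point, then the
# gamut-mapping matrix between a source and a target RGB space.

def xy_to_XYZ(x, y):
    # Chromaticity (x, y) -> XYZ with Y normalized to 1.
    return [x / y, 1.0, (1.0 - x - y) / y]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def inv3(m):
    # Inverse of a 3x3 matrix via the adjugate.
    (a, b, c), (d, e, f), (g, h, i) = m
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    adj = [
        [e * i - f * h, c * h - b * i, b * f - c * e],
        [f * g - d * i, a * i - c * g, c * d - a * f],
        [d * h - e * g, b * g - a * h, a * e - b * d],
    ]
    return [[v / det for v in row] for row in adj]

def rgb_to_xyz(primaries, white):
    # primaries: [(xr, yr), (xg, yg), (xb, yb)]; white: (xw, yw)
    cols = [xy_to_XYZ(x, y) for x, y in primaries]
    P = [[cols[j][i] for j in range(3)] for i in range(3)]  # columns as matrix
    W = xy_to_XYZ(*white)
    Pinv = inv3(P)
    S = [sum(Pinv[i][j] * W[j] for j in range(3)) for i in range(3)]
    return [[P[i][j] * S[j] for j in range(3)] for i in range(3)]

BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
D65 = (0.3127, 0.3290)

M = rgb_to_xyz(BT2020, D65)
# rec.2020 -> XYZ -> rec.2020: the round trip is the identity matrix.
gamut_map = matmul(inv3(M), M)
```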
Comment 2 Simon Berz 2023-12-29 23:51:53 UTC
(In reply to Zamundaaa from comment #1)
> Does your monitor have something where it shows information about the
> inputs? What you're describing sounds a lot like it thinks it's getting
> rec.709 content

Unfortunately not. It only shows port, resolution, and refresh rate.
Comment 3 Zamundaaa 2023-12-30 00:36:48 UTC
As a shot in the dark, could you check the output of drm_info on your system? Just to make sure the colorspace metadata is set properly
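For anyone repeating this check, a sketch of the kind of filtering meant here. The sample drm_info output below is made up for illustration (though "Colorspace" and "HDR_OUTPUT_METADATA" are real KMS connector properties); on a real system you would pipe drm_info directly instead of the heredoc:

```shell
# Filter for the HDR-relevant connector properties. On a real system:
#   drm_info | grep -E 'Colorspace|HDR_OUTPUT_METADATA'
cat <<'EOF' | grep -E 'Colorspace|HDR_OUTPUT_METADATA'
Connector 0: DP-1 (connected)
  "Colorspace" (enum): BT2020_RGB
  "HDR_OUTPUT_METADATA" (blob): 123
  "max bpc" (range): 10
EOF
```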
Comment 4 Simon Berz 2023-12-30 16:03:28 UTC
Created attachment 164556 [details]
drm_info, kscreen-doctor, edid, and ddcutil outputs for SDR & HDR modes on HDMI & DP

I added a new attachment with drm_info outputs. 

I also did more testing focusing on the monitor colorspace. 
It actually works fine on Linux using HDMI instead of DisplayPort!

On Linux, HDMI always used YCbCr while DP always used RGB color formats (the monitor has a way to show this, just not in the info section where all other connection info is). These formats are used regardless of refresh rate, and the image shows no signs of chroma subsampling (so probably not a bandwidth limit). On macOS, YCbCr is used for both DP and HDMI, and both show full colors. On Windows 10 I tested YCbCr and RGB over DP on the same hardware, and both formats worked.

So the issue can be narrowed down to DisplayPort or the RGB color format (or a combination of both) on Linux. It starts to look more like a kernel issue to me, but I don't really know enough about the APIs to be certain.

This is probably a bit late to ask, but WCG and HDR should work on AMD without a patched kernel, right? As far as I have followed/understood the developments in this space, HDR should work right now, and the new proposals are about moving the processing from shaders to GPU hardware for better power efficiency.
Comment 5 Zamundaaa 2023-12-30 21:42:11 UTC
That's interesting. There's https://gitlab.freedesktop.org/drm/amd/-/issues/3060, but there the Colorspace property isn't exposed by the kernel; here it is, and it's set to the correct value. It might still be related.
Either way, it's worth opening a new issue on drm/amd for this; it does sound like a driver bug, or maybe a monitor quirk, that needs to be addressed in the kernel.

> This is probably a bit late to ask, but WCG and HDR should work on AMD without a patched kernel, right? As far as I have followed/understood the developments in this space, HDR should work right now, and the new proposals are about moving the processing from shaders to GPU hardware for better power efficiency.

Yes.
Comment 6 Simon Berz 2023-12-30 23:54:05 UTC
Thank you for your help so far.
I opened https://gitlab.freedesktop.org/drm/amd/-/issues/3079
Comment 7 Zamundaaa 2024-01-01 03:05:05 UTC
This bug report made me test HDMI on my monitor too, and it looks much better for me as well. I'm fairly certain it's a driver bug at this point, so let's track this exclusively on the amd issue.