SUMMARY
The SDR brightness setting also changes the brightness of HDR content, instead of HDR content displaying at the monitor's default/highest brightness. To get full brightness for HDR, you have to turn the SDR brightness all the way up to 500 nits; otherwise HDR is dim. This makes SDR content too bright.

STEPS TO REPRODUCE
1. Update to 6.3
2. Enable HDR
3. Open HDR content/game

OBSERVED RESULT
HDR is shown at the SDR brightness setting, rather than at peak brightness.

EXPECTED RESULT
HDR should follow the monitor's max/default brightness for HDR mode.

SOFTWARE/OS VERSIONS
Linux/KDE Plasma: Arch Linux
KDE Plasma Version: 6.3
KDE Frameworks Version: 6.10.0
Qt Version: 6.8.2
Can confirm.
Operating System: Arch Linux
KDE Plasma Version: 6.3.0
KDE Frameworks Version: 6.10.0
Qt Version: 6.8.2
Kernel Version: 6.13.2-2-cachyos (64-bit)
Graphics Platform: Wayland
Processors: 16 × AMD Ryzen 7 9800X3D 8-Core Processor
Memory: 62.5 GiB of RAM
Graphics Processor: AMD Radeon RX 6700 XT
Manufacturer: ASUS
All content brightness is anchored to SDR brightness, and that's intentional. HDR content is at the "original" brightness when SDR brightness is at 203 nits; there's no "full brightness". If you set SDR brightness to 500 nits, HDR content will be roughly (ignoring tone mapping) 2.5x as bright as in 6.2.
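[Editor's note: to make the scaling described above concrete, here is a minimal sketch of the arithmetic (ignoring tone mapping); the variable names are illustrative, not KWin's actual code.]

#include <cstdio>

// HDR content is anchored to the reference luminance: at 203 cd/m2
// (the BT.2408 reference white) it shows at its "original" brightness;
// other settings scale it proportionally, tone mapping aside.
int main()
{
    const double referenceWhite = 203.0; // cd/m2, BT.2408 reference white
    const double sdrBrightness = 500.0;  // hypothetical slider setting
    std::printf("HDR scale factor: %.2fx\n", sdrBrightness / referenceWhite);
    // prints 2.46x, i.e. roughly 2.5x as bright as at 203 nits
}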
(In reply to Zamundaaa from comment #2) > All content brightness is anchored to SDR brightness, and that's intentional. > > HDR content is at the "original" brightness when SDR brightness is at 203 > nits; there's no "full brightness". If you set SDR brightness to 500 nits, > HDR content will be roughly (ignoring tone mapping) 2.5x as bright as in 6.2. Is it possible for an application to ignore the SDR brightness setting while displaying HDR content? I'd like my HDR media to be shown unaltered but find 203 nits too bright for desktop use, so needing to adjust the SDR brightness every time an HDR file is played seems a bit tedious.
(In reply to Zamundaaa from comment #2) > All content brightness is anchored to SDR brightness, and that's intentional. > > HDR content is at the "original" brightness when SDR brightness is at 203 > nits; there's no "full brightness". If you set SDR brightness to 500 nits, > HDR content will be roughly (ignoring tone mapping) 2.5x as bright as in 6.2. I have a legitimate question because I am quite ignorant on the subject: why is the "original" brightness of HDR content at 203 nits SDR brightness? Why specifically this value? Thanks for your feedback!
(In reply to Zamundaaa from comment #2) > All content brightness is anchored to SDR brightness, and that's intentional. > > HDR content is at the "original" brightness when SDR brightness is at 203 > nits; there's no "full brightness". If you set SDR brightness to 500 nits, > HDR content will be roughly (ignoring tone mapping) 2.5x as bright as in 6.2. Isn't the point of the SDR brightness setting that you can make SDR content look normal when HDR is enabled? Why link them like this? Wouldn't it have made more sense to add a separate HDR brightness slider instead? This was the main issue I had with Windows HDR settings. Making SDR content look normal on Windows is so much harder than it was in KDE 6.2, but now it's impossible to get it to look normal without messing up HDR in 6.3. I don't want to have to enable and disable HDR all the time in KDE like I did on Windows...
(In reply to Zamundaaa from comment #2) > All content brightness is anchored to SDR brightness, and that's intentional. > > HDR content is at the "original" brightness when SDR brightness is at 203 > nits; there's no "full brightness". If you set SDR brightness to 500 nits, > HDR content will be roughly (ignoring tone mapping) 2.5x as bright as in 6.2. I don't understand this decision. Why should the brightness of SDR content affect the brightness of HDR content? I echo the sentiment of the person above me: there should be separate sliders for SDR content and HDR paper white. I want my HDR content to follow the EOTF as closely as possible. At the very least, rename the current slider to something else. The setting shouldn't have "SDR" in the name if it's a slider for HDR.
(In reply to Confined from comment #3) > Is it possible for an application to ignore the SDR brightness setting while > displaying HDR content? No. It could technically match the reference luminance without altering the content, but that's very heavily discouraged because it has a ton of negative side effects. > I'd like my HDR media to be shown unaltered but find > 203 nits too bright for desktop use, so needing to adjust the SDR brightness > every time an HDR file is played seems a bit tedious. There's no such thing as "unaltered", unless you're in the viewing environment the content was mastered in. I'll elaborate more below. (In reply to Geoffrey Chancel from comment #4) > I have a legitimate question because I am quite ignorant on the subject: > why is the "original" brightness of HDR content at 203 nits SDR > brightness? Why specifically this value? 203cd/m² is defined by BT.2408 as the reference white for content using the PQ or HLG transfer functions. In other words, in an HDR10 video, "SDR" things like subtitles are 203cd/m² bright, and the entire video's brightness is relative to 203cd/m². Windows games completely ignore these standards, but you can configure their brightness levels to work fine with this. (In reply to paul-serres from comment #5) > Isn't the point of the SDR brightness setting that you can make SDR > content look normal when HDR is enabled? Why link them like this? Wouldn't > it have made more sense to add a separate HDR brightness slider instead? What would that "HDR" slider apply to? Something that goes 10% above SDR brightness levels? 20%? 50%? Something only with specific transfer functions? If so, which ones? If you re-encode an SDR image with BT.2100 (as you might do with screenshots, for example, to handle HDR content on the screen), and there were a separate brightness slider for BT.2100 content specifically, then that screenshot would look radically different from the image you just viewed on your screen. Qt applications will soon start using scRGB for rendering with HDR content in their windows; a separate brightness slider would completely wreck the brightness of "SDR" things in their windows too. So no, a separate "HDR" brightness slider is out of the question. The point of the "Maximum SDR Brightness" slider is merely to set what 100% means for the normal brightness slider, because HDR screens are terrible and their self-reported values can't be trusted. It'll be moved into a calibration page sooner rather than later. (In reply to TheFeelTrain from comment #6) > I want my HDR content to follow the EOTF as closely as possible. Unless you're matching the viewing environment to the one the content was mastered in, that's just plain nonsense. If your room is brighter than the one the content was mastered in, then you need to view the content with increased brightness, or it will look darker than it's supposed to. If your room is darker than the one the content was mastered in, then the content needs to be presented with decreased brightness, or it'll look brighter than it's supposed to. (In reply to TheFeelTrain from comment #6) > At the very least, rename the current slider to something else. The setting > shouldn't have "SDR" in the name if it's a slider for HDR. Suggestions would be welcome. It's called the "Maximum SDR Brightness" to give some indication of what it controls. Calling it "Maximum Reference Luminance" would be more technically correct, but most people seeing it wouldn't understand what it means.
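[Editor's note: as a concrete illustration of the BT.2408 reference level discussed above, here is a minimal sketch of the SMPTE ST 2084 (PQ) EOTF, showing that a PQ signal of roughly 58% decodes to about 203 cd/m²; the constants are the published ST 2084 values, the code itself is illustrative, not KWin's.]

#include <algorithm>
#include <cmath>
#include <cstdio>

// SMPTE ST 2084 (PQ) EOTF constants
constexpr double m1 = 2610.0 / 16384.0;        // 0.1593017578125
constexpr double m2 = 2523.0 / 4096.0 * 128.0; // 78.84375
constexpr double c1 = 3424.0 / 4096.0;         // 0.8359375
constexpr double c2 = 2413.0 / 4096.0 * 32.0;  // 18.8515625
constexpr double c3 = 2392.0 / 4096.0 * 32.0;  // 18.6875

// Decode a non-linear PQ signal (0..1) to absolute luminance in cd/m2.
double pqEotf(double signal)
{
    const double e = std::pow(signal, 1.0 / m2);
    const double num = std::max(e - c1, 0.0);
    const double den = c2 - c3 * e;
    return 10000.0 * std::pow(num / den, 1.0 / m1);
}

int main()
{
    // BT.2408 puts HDR reference ("graphics") white at 203 cd/m2,
    // which corresponds to a PQ signal of roughly 58%.
    std::printf("PQ 0.58 -> %.1f cd/m2\n", pqEotf(0.58)); // ~202, i.e. ~203
}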
(In reply to Zamundaaa from comment #7) > (In reply to Geoffrey Chancel from comment #4) > > I have a legitimate question because I am quite ignorant on the subject: > > why is the "original" brightness of HDR content at 203 nits SDR > > brightness? Why specifically this value? > 203cd/m² is defined by BT.2408 as the reference white for content using the > PQ or HLG transfer functions. In other words, in an HDR10 video, "SDR" things > like subtitles are 203cd/m² bright, and the entire video's brightness is > relative to 203cd/m². > > Windows games completely ignore these standards, but you can configure their > brightness levels to work fine with this. Thank you very much for your feedback, I understand it now. Do you think it would be feasible to put a small bubble in the display settings, as is the case for the other settings, to explain to users what you explained to me? So that everyone can understand that 203 is the SDR brightness according to the standard. Best regards.
(In reply to Geoffrey Chancel from comment #8) > Thank you very much for your feedback, I understand it now. > Do you think it would be feasible to put a small bubble in the > display settings, as is the case for the other settings, to explain to > users what you explained to me? So that everyone can understand that 203 is > the SDR brightness according to the standard. No, that would just cause more confusion. If you're concerned with setting up some specific viewing environment from a standard and then configuring the software to match it, then you already know the value for that viewing environment (which may not be the same as in BT.2408). If you don't know about the viewing environment standards, then it's better that you not know any misleading numbers. As everyone knows from phones and laptops, the brightness slider is there to be at whatever value you're comfortable with.
> Suggestions would be welcome. It's called the "Maximum SDR Brightness" to > give some indication of what it controls. Calling it "Maximum Reference > Luminance" would be more technically correct, but most people seeing it > wouldn't understand what it means. Most games call this "HDR Paper White", so that would be recognizable for most people. However, this illustrates the problem with this setting, as games let you set this value in their own settings. So unless the system setting is 203, the in-game setting is now going to be inaccurate. For example, I want my desktop to be at 100 nits because that's what I run my monitor at in SDR, but if I were to set Plasma's slider to 100 and the game's paper white setting to 200 as I usually would, everything is half the brightness it should be. It gets even more messy if I set the Plasma slider to something like 300, which is fairly normal for people to use. If the in-game setting is also set to 300, now in-game is 1.5x the brightness it should be. No reasonable person would expect it to work this way. A value of 300 should result in an output of 300. Not only is this an added layer of complexity, but it is completely invisible to the end user unless they happen to stumble across this bug report like I did. Nobody was confused about how it worked before. Now you have multiple people here who were confused enough to post about it. You've completely decoupled the in-game value from the actual output value. > Unless you're matching the viewing environment to the one the content was > mastered in, that's just plain nonsense. > If your room is brighter than the one the content was mastered in, then you > need to view the content with increased brightness, or it will look darker > than it's supposed to. > If your room is darker than the one the content was mastered in, then the > content needs to be presented with decreased brightness, or it'll look > brighter than it's supposed to. How is it nonsense? While I lack the means to measure it, I imagine my room is pretty close to a reference environment. I have blackout curtains and the lamp I use is extremely dim. I run my monitor at exactly 100 nits for SDR and it is perfect. I was completely happy with how it worked in 6.2. As someone else already mentioned here, there is now no way for me to have SDR content at 100 nits while also having HDR content at 203 at the same time. > What would that "HDR" slider apply to? Something that goes 10% above SDR > brightness levels? 20%? 50%? Something only with specific transfer > functions? If so, which ones? It is strange to me that you need to ask this. One slider for SDR surfaces and one slider for HDR surfaces is not some crazy suggestion. It's obvious it's possible to have separate settings, because it already worked that way in 6.0, 6.1, and 6.2. Maybe in the future all content will be HDR, but for right now there is a very clear distinction between what is HDR and what is not.
I use HDR for gaming and movies; why is such a downgrade necessary? I know technicality is most important for some, but for a regular user this makes little sense. We had a very good implementation before, and now it is impossible to use without manual switching. Are there at least any plans to introduce an actual SDR brightness setting in the future?
(In reply to Zamundaaa from comment #7) > Qt applications will soon start using scRGB for rendering with HDR content > in their windows; a separate brightness slider would completely > wreck the brightness of "SDR" things in their windows too. This seems like a good point. The issue right now is, I don't want white websites and documents to blast my eyes out at 203 nits, but still want games and videos to be bright and look nice. In 6.2 this was achieved by having HDR content not be affected by SDR brightness (since media is often HDR and other apps are SDR), but if Firefox or LibreOffice suddenly started rendering in HDR then they would have been too bright. It seems like more of a media vs productivity issue than HDR vs SDR. It's almost like every program needs its own SDR content brightness slider, or to read one from a system-wide setting. I thought about per-app brightness sliders, like we have for audio volume, but then HDR content inside of productivity applications would be affected, and brightness is already equal across apps, unlike volume, which can be all over the place.
Hi, let me also jump in here, just to be sure that things work as they should. Like the other people, I would like to have my desktop SDR at ~100 nits. My OLED monitor reaches a full-screen brightness of 250 nits, so I have set 250 for the "max SDR brightness" value and adjusted the brightness slider to 40% (even less at the moment, because that's really bright; I have to check whether that matches Windows' brightness behaviour). But as mentioned here in the thread, that would also affect the brightness of HDR games, as that way I only have 100 nits set as the reference, and not 203? Many thanks!
(In reply to TheFeelTrain from comment #10) > Most games call this "HDR Paper White", so that would be recognizable for > most people. However, this illustrates the problem with this setting, as > games let you set this value in their own settings. So unless the system > setting is 203, the in-game setting is now going to be inaccurate. > > For example, I want my desktop to be at 100 nits because that's what I run my > monitor at in SDR, but if I were to set Plasma's slider to 100 and the > game's paper white setting to 200 as I usually would, everything is half the > brightness it should be. It gets even more messy if I set the Plasma slider > to something like 300, which is fairly normal for people to use. If the > in-game setting is also set to 300, now in-game is 1.5x the brightness it > should be. No reasonable person would expect it to work this way. A value of > 300 should result in an output of 300. That's not how that works. Not on Android, nor iOS, nor macOS, nor on any TV, nor on Windows laptops. Only Windows pretends that it's the case on desktop monitors, and that's not intentional, but a serious design flaw kept for backwards compatibility. Even there it's not really true, because monitors do their own things with the image. But I fully agree that the vast majority of Windows games do HDR quite badly and in a way that confuses a lot of people. The fact that you have to configure brightness settings per game in 2025 is ridiculous! > How is it nonsense? While I lack the means to measure it, I imagine my room > is pretty close to a reference environment. I have blackout curtains and > the lamp I use is extremely dim. I run my monitor at exactly 100 nits for > SDR and it is perfect. There is no single reference environment; there are many, for different standards. With BT.2408, 203 nits is the comfortable average brightness, in other words the correct one for SDR content. If that's too bright for your room, then you do not have the BT.2408 viewing environment, but something darker. > > What would that "HDR" slider apply to? Something that goes 10% above SDR > > brightness levels? 20%? 50%? Something only with specific transfer > > functions? If so, which ones? > > It is strange to me that you need to ask this. One slider for SDR surfaces > and one slider for HDR surfaces is not some crazy suggestion. It's obvious > it's possible to have separate settings, because it already worked that way > in 6.0, 6.1, and 6.2. It did not work like that. BT2020PQ and scRGB had some special-cased logic for the frog protocol / gamescope specifically, which did not work with any other HDR content and made brightness control and tone mapping more difficult and limiting. (In reply to klaymorer from comment #12) > This seems like a good point. The issue right now is, I don't want white > websites and documents to blast my eyes out at 203 nits, but still want > games and videos to be bright and look nice. In 6.2 this was achieved by > having HDR content not be affected by SDR brightness (since media is often > HDR and other apps are SDR), but if Firefox or LibreOffice suddenly started > rendering in HDR then they would have been too bright. It seems like more of > a media vs productivity issue than HDR vs SDR. Adding a window action or something to change the brightness of an individual window could be reasonable, so you could increase or decrease the brightness of a game or movie without affecting the rest.
In theory, we have protocols to allow apps to specify whether or not their window contains a game or video, but in practice, for this use case, it wouldn't be too useful, as it's not that widely used yet. It also can't really deal with, for example, web browsers showing a video or game inside a website. (In reply to bugreports61 from comment #13) > let me also jump in here, just to be sure that things work as they should. > Like the other people, I would like to have my desktop SDR at ~100 nits. Please, get that brightness target out of your head. Unless you set up a specific *measured* viewing environment, you really should not care about any numbers here. Reference viewing environments are guidelines for what whitepoint, display and environment brightness levels, amount of glare etc. are good for content production, not something that you as the consumer are supposed to recreate! Just set up the brightness to be what you're comfortable with. All content, HDR or not, will be adjusted to match, and there's nothing more to it than that.
> (In reply to bugreports61 from comment #13) > > let me also jump in here, just to be sure that things work as they should. > > Like the other people, I would like to have my desktop SDR at ~100 nits. > Please, get that brightness target out of your head. Unless you set up a > specific *measured* viewing environment, you really should not care about > any numbers here. > > Reference viewing environments are guidelines for what whitepoint, display > and environment brightness levels, amount of glare etc. are good for content > production, not something that you as the consumer are supposed to recreate! > > Just set up the brightness to be what you're comfortable with. All content, > HDR or not, will be adjusted to match, and there's nothing more to it than > that. I don't care about an arbitrary number, but that's 40% brightness of my display (250 nits max fullscreen) and I find it comfortable for my eyes. That's why I mentioned those 100 nits.
(In reply to Zamundaaa from comment #14) > That's not how that works. Not on Android, nor iOS, nor macOS, nor on > any TV, nor on Windows laptops. I'll admit I have no experience with HDR on mobile devices or macOS, but most HDR TVs worth buying will try to follow the EOTF if you simply set the brightness to max. It's not blinding because you're sitting more than 1 meter away from the screen, and most of the time spent on a TV isn't spent looking at UI elements like on a monitor anyway. There are also a lot of TVs that will run the UI at one brightness and then jump to max brightness once you're actually watching HDR content. Some TVs and monitors will also lock you out of controlling the brightness in HDR mode entirely. And I don't think bringing iOS and Android into the conversation is even relevant. Nobody cares about the brightness curves of their phone screen. Anyone who wants a proper HDR viewing experience is not using their phone to watch movies in the first place. > Only Windows pretends that it's the case on desktop monitors, and that's not > intentional, but a serious design flaw kept for backwards > compatibility. Even there it's not really true, because monitors do their own > things with the image. The implication here is that you want to willingly break backwards compatibility, which isn't exactly ideal. Even if you disagree with how desktop HDR currently works, now you're making it even more confusing to the users who are already used to it. This is a classic case of "the current standard sucks, let's make a new better standard", and now there are two competing standards. I'm not going to sit here and say you need to copy what Windows does. But it should at least be intuitive. Like I said before, nobody was confused about how it worked prior to 6.3. Now you have a lot of people who were confused enough to post about it. Even if the implementation was more complicated, nonsense, whatever: it resulted in a smoother, less confusing user experience. At the very, *very* least there needs to be some explanation in the settings that anything below 203 will result in the full range of your monitor not being utilized. The worst part of the change is how it's done without the user's knowledge. > Just set up the brightness to be what you're comfortable with. All content, > HDR or not, will be adjusted to match, and there's nothing more to it than > that. There is more to it than that. It is not as simple as having one brightness that applies in every scenario. I paid extra for a full HDR1000 monitor with local dimming. I don't want my desktop environment to cut that down to an HDR500 monitor just because I don't want to be blinded when I browse the web or look at a spreadsheet. The problem is *when* and *where* the brightness is happening, not the brightness curve as a whole. I don't mind having the full 1000 nit highlights for things like fire, explosions, lightning, etc. I don't understand why you think the comfort level for desktop use is a 1:1 correlation with the comfort level for watching a movie or playing a game.
I think most of us just want the ability to move SDR content along the color space of HDR so that it looks "correct" to us. Whether that is technically correct or not is down to personal preference. In my case, I just want SDR content to look the same, or almost the same, regardless of whether I have HDR enabled, so that I can just set and forget the HDR settings. In 6.2 I could achieve this using the SDR brightness setting and sRGB color intensity settings. My monitor specifically needs SDR brightness set to 600 and color intensity set to 100 for SDR to look the same as it does with HDR off. But now that makes HDR get blown out, and 203 is too dim for SDR content for me. I don't know a lot about how HDR works, but to my understanding, HDR is just an expanded color range/light intensity range. Is it possible to "shift" the SDR range along the larger HDR range relative to that SDR brightness setting? Or maybe map SDR content to the HDR range in some way?
(In reply to TheFeelTrain from comment #16) > I'll admit I have no experience with HDR on mobile devices or macOS, but > most HDR TVs worth buying will try to follow the EOTF if you simply set the > brightness to max. No, they don't. The vast majority of TVs can't even go above 200 nits; they can't "follow the EOTF" without making the image terrible. TVs do a ton of processing, including tone and gamut mapping, dynamically changing brightness and so on, to make the image look good with the limited capabilities of the display. And as you noticed yourself, they have *one* brightness setting, just like every other sane system, not multiple for different kinds of content. > The implication here is that you want to willingly break backwards compatibility, which isn't exactly ideal There neither is, nor ever has been, "backwards" compatibility with Windows applications. We make Windows games work as well as is possible without making Linux applications suffer for it. That line will not be crossed, and that's not up for debate. > I'm not going to sit here and say you need to copy what Windows does. But it should at least be intuitive Users configuring many confusing and differently named numbers in each game just to work with one of their displays is never going to be intuitive. The only way to make it intuitive is for applications to be Wayland native and support the APIs we provide. Or at least to use the (actually not too different) APIs that Windows provides for this purpose, so that Wine could map them. > At the very, *very* least there needs to be some explanation in the settings that anything below 203 will result in the full range of your monitor not being utilized. > I paid extra for a full HDR1000 monitor with local dimming. I don't want my desktop environment to cut that down to an HDR500 monitor just because I don't want to be blinded when I browse the web or look at a spreadsheet. The problem is *when* and *where* the brightness is happening, not the brightness curve as a whole. I don't mind having the full 1000 nit highlights for things like fire, explosions, lightning, etc. That's not how HDR works. Setting the reference luminance to 100 nits does not mean that the maximum brightness gets limited to 500 nits. Setting it to 203 nits does not mean the full brightness range of the monitor gets used, and setting it to 10 doesn't guarantee it does not get used. > But now that makes HDR get blown out If you're talking about videos, that's something we can definitely still improve. The tone mapping curve we have right now is usable, but we can do better. If you're talking about games, tone mapping them is more difficult, because they rarely provide correct HDR metadata. You should, however, be able to configure them for whatever display settings you're using. If some game has artificial limitations that prevent that, then you'll have to either turn the brightness down or turn its HDR setting off. > Is it possible to "shift" the SDR range along the larger HDR range relative to that SDR brightness setting? Or maybe map SDR content to the HDR range in some way? No. SDR content gets mapped to the reference luminance, and nothing else is possible without breaking tons of things.
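[Editor's note: as a hypothetical sketch of what "SDR content gets mapped to the reference luminance" means in practice (this is illustrative, not KWin's actual code): SDR white lands exactly on the reference luminance, and nothing is stretched beyond it.]

#include <cmath>
#include <cstdio>

// sRGB inverse EOTF (IEC 61966-2-1), v in 0..1
double srgbToLinear(double v)
{
    return v <= 0.04045 ? v / 12.92 : std::pow((v + 0.055) / 1.055, 2.4);
}

int main()
{
    const double referenceLuminance = 203.0; // cd/m2, set by the slider
    const double sdrWhite = 1.0;             // full-intensity SDR pixel
    // An SDR pixel is placed in absolute luminance by scaling its
    // linearized value by the reference luminance:
    std::printf("SDR white -> %.0f cd/m2\n",
                srgbToLinear(sdrWhite) * referenceLuminance); // 203
}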
(In reply to Zamundaaa from comment #18) > No, they don't. The vast majority of TVs can't even go above 200 nits; they > can't "follow the EOTF" without making the image terrible. > TVs do a ton of processing, including tone and gamut mapping, dynamically > changing brightness and so on, to make the image look good with the limited > capabilities of the display. Okay? I specifically said "TVs worth buying." I'm not talking about $200 TCL and Amazon Fire TVs; I'm talking about TVs actually capable of hitting at least 1000 nits. You should not be basing the HDR experience on displays not even capable of proper HDR. > And as you noticed yourself, they have *one* brightness setting, just like > every other sane system, not multiple for different kinds of content. No I didn't. It's like you didn't even read my comment. "There are also a lot of TVs that will run the UI at one brightness and then jump to max brightness once you're actually watching HDR content. Some TVs and monitors will also lock you out of controlling the brightness in HDR mode entirely." They explicitly have one brightness setting for the UI and one brightness setting for watching HDR content. That is two brightness settings. > That's not how HDR works. Setting the reference luminance to 100 nits does > not mean that the maximum brightness gets limited to 500 nits. Setting it to > 203 nits does not mean the full brightness range of the monitor gets used, > and setting it to 10 doesn't guarantee it does not get used. Then explain how it works. That's my biggest problem with this whole thing. Nobody knows how it works. I am going off of what you said earlier: > If you set SDR brightness to 500 nits, HDR content will be roughly (ignoring tone mapping) 2.5x as bright as in 6.2 How does this not mean HDR content will be 0.5x if I set SDR brightness to 100? Can you explain what this setting is *actually* doing then? You are legitimately the only person on planet Earth who knows how Plasma's HDR functions. All we have to go off of is what you've said here. Again, that's one of my biggest issues with this change. It's confusing.
(In reply to Zamundaaa from comments #7, #14, #18) > HDR screens are terrible and their self-reported values can't be trusted. > The vast majority of TVs can't even go above 200 nits; they can't "follow the EOTF" without making the image terrible. How about we don't make assumptions about user hardware? If self-reported values can't be trusted, why does KWin perform tone mapping based on the max luminance reported by the monitor's EDID? Heck, why even support HDR at all if this is the case? > The fact that you have to configure brightness settings per game in 2025 is ridiculous! > Adding a window action or something to change the brightness of an individual window could be reasonable, so you could increase or decrease the brightness of a game or movie without affecting the rest. Is per-application brightness control ridiculous or reasonable? Please clarify. > All content, HDR or not, will be adjusted to match, and there's nothing more to it than that. That's the issue. Comments on this thread and others clearly and consistently indicate that we do not want this behavior. You're correct in that a non-reference display in a non-reference viewing environment should be adjusted accordingly. Since we can't expect every display to have an ambient light sensor and do this automatically, it's fair to give users the option to make this adjustment according to preference. That said, I do not believe it is at all reasonable to expect that users will have the same preference for both SDR and HDR content. If the only solution you're willing to consider is adjusting this setting per window via a rule, then fine, I can live with that. But this needs to be remedied.
*** You could skip ahead to the next comment, but I do suspect there's good info here *** There are a lot of people mutually not understanding each other here, I think. And I'm one of them too. I have no idea what any of these sliders mean now or why they're affecting what they're affecting, and the explanation here has not managed to penetrate my brain. As far as I'm aware, HDR prior to 6.3 worked exactly as I wanted and expected it to. When I set the SDR brightness slider to 250 nits, SDR content (and crucially, my desktop) displayed between 0 and 250 nits. Then when I opened an HDR video in MPV with all the flags and env variables, it displayed the HDR video between 0 and 1000 nits, which was the range of my monitor (OLED, if it matters). That's exactly what I wanted. After the 6.3 update, HDR video is now noticeably *not* reaching 1000 nits, and I don't understand why. And honestly, I don't want to. I want my OS to just let me make my desktop 250 nits max, but then actually show me HDR content in HDR regardless of that setting. Currently, what I've (possibly incorrectly) taken from this thread is that any time I want to watch an HDR video I have to: 1. open Settings and go to my display, 2. set the SDR brightness slider to max, blowing out my eyeballs with my now way-too-bright desktop, *then* 3. open my HDR video and see it at my monitor's full range. Then (4) when I'm done, I blow out my eyes again before I can tab back to Settings and set it back! What???? Look, I don't care in the least what some BT.2020 standard says about comfortable white or some other thing I don't understand. I got a 1000-nit peak brightness monitor, and when I have an HDR video with brightness information for 0 to 1000 nits, I'd like my monitor to just do that. That's a feature request from me. If that were a setting, I'd turn it on. If there's some reason why HDR content *needs* to be darkened below my monitor's range, please, in some way, make that independent of my desktop's brightness. Like, do an "SDR Brightness Range" slider where I could pick some numbers. I'd pick 0 to 250. Or, if for some reason I needed to raise blacks, I could set, say, 20 to 250 (nits?). Then that would immediately make sense next to an "HDR Brightness Range" slider that would allow me to pick a brightness range between 0 and 1000, or 20 and 500, or whatever. Or if three numbers are required, please let me just set them: 0 for the bottom of the range, 300 for some 'reference white' or something, then 1000 for peak brightness. If it's a perception thing, let me use my perception to pick numbers that work for my environment. Then if some HDR content *needed* to be dimmed, it wouldn't mess with my preferred desktop brightness. And messing with my desktop brightness wouldn't mess with my HDR content consumption. *** I think the below is the most important part, feel free to only bother with it *** I have failed to be succinct, but one more thing anyway: I gather from the comments that the desktop brightness is being used as a kind of 'reference white'. But I think this assumption is wrong: the comfortable desktop white is not necessarily my desired HDR content 'reference white'. These are fundamentally different tasks, and they shouldn't be linked. I like my desktop very dim, as it's better for my astigmatism for reading text. But that's really not relevant for an HDR film or game 99.9% of the time. It's a different thing. I *do* want my brightness higher for content.
Like, HDR aside, when you're watching a movie, do you not hit the brightness up hotkey a few times? Then when you're done, you set it back? That's what I mean. Desktop white isn't content white. Okay finally done now. Sorry for the length, sorry for the repetition. But I think it accurately communicates the way this current functionality is bothering me and why. And I wanted to say a huge thank you to you, Zamundaaa, as well. Your work on HDR is what finally took away my last anchor to Windows, and now I'm Linux desktop on everything which has been a major improvement in almost every way.
> After the 6.3 update, HDR video is now noticeably *not* reaching 1000 nits, and I don't understand why. And honestly, I don't want to. I want my OS to just let me make my desktop 250 nits max, but then actually show me HDR content in HDR regardless of that setting. With your settings, HDR content is 23% brighter than in 6.2 (a 250-nit reference versus the 203 nits it is anchored to: 250 / 203 ≈ 1.23). If some HDR video looks darker to you than before, feel free to make a separate bug report about it. It's not related to this. > Like, HDR aside, when you're watching a movie, do you not hit the brightness up hotkey a few times? Then when you're done, you set it back? That's what I mean. Desktop white isn't content white. No, I don't. But just as has been discussed here before, what you're actually asking for is a way to change the brightness depending on the content type, not depending on a meaningless differentiation between "HDR" and "SDR".
I would also like to add to and reopen this as an issue, with the following findings (and a bit of a tangent to add context). My specific display has a peak brightness of 1000 nits; to achieve full-range accuracy in HDR, I require a reference brightness of 100 nits, so I have to set the current slider to 100. These are the settings I use to master content as an Unreal Engine game developer; I have also watched media and played multiple titles with these settings, and it looks fantastic, with an excellent and precise range of contrast from shadow to highlight and great color accuracy. This, however, is where the issue comes in: if I now switch to any SDR application, or use my desktop itself outside of whatever HDR content I am trying to consume or master, it is incredibly dim, so I have to increase the slider, usually to around ~300, to view it comfortably. Doing so destroys the range of any HDR content I may still be trying to view, raising all of the mid range and completely blowing out highlights (I have found with many titles that setting the maximum brightness anywhere above 100 begins to degrade the accuracy of mids and always blows out highlights; by the time I am at a viewable brightness for SDR, HDR looks incredibly overexposed and unusable). So I then have to put the slider back to 100 to resume whatever HDR content I was consuming. Before this change in 6.3, I simply set the two different values, HDR looked perfect, SDR was not overly dim, and I could swap between the two happily without opening the settings menu to change anything. What I propose is bringing back two settings. SDR brightness: this would map/shift the SDR brightness range within the HDR range while HDR is enabled, ensuring usability for the desktop and SDR applications (e.g., as I said before, I would have this set to ~300 to use comfortably, depending on my environment). HDR reference brightness: this would be the reference white brightness for HDR content (e.g., I would have this set to 100 for my light-controlled environment, or 203 for a lit environment to ensure visibility at the cost of a little accuracy; I also have to change this depending on what I am consuming, as some content is clearly mastered outside of, or to, a different standard). This change would allow the user to respect the standards for HDR and any content viewed in it, whilst also being able to view SDR content comfortably, without constantly opening the settings to change sliders. TL;DR: in my light-controlled environment, SDR is far too dim in any configuration that achieves accurate HDR without blowing out highlights and degrading mid-range accuracy (because of the midpoint raised by this setting currently), so letting the user change the SDR mapped brightness range irrespective of the HDR reference brightness would be paramount to a good user experience.
I think, trying to parse this, that separate brightness ranges depending on content type would be a feature request, since the current functionality is, in a narrow sense, how HDR 'should' work. For end-users, this implementation is incomplete/broken, not in the technical sense, but in a usability sense. So, potential feature request: improve system usability on HDR displays by allowing HDR content to have its own distinct brightness setting, separate from SDR content. LewisT's comment is a good example of a use case for that feature. I realize that in the back-end there's not necessarily a hard distinction. But in the front-end there absolutely is, and this implementation doesn't work the way I (and many others) want it to. This is one of those 'end-users are wrong' cases where the end-users are actually right, because we're not trying to get the most programmatically correct outcome, we're trying to get the most 'usable' outcome. And this current setup is not as usable. If in the future this leads to more work, as the desktop itself maybe becomes less distinctly 'SDR', then I think that work is highly worthwhile. My computer is most *usable* when these concepts are different. If that's an abstraction from what's really happening, then please abstract it away. I want content with 'brightness data' to be treated one way, and content with the old way of only color data to be treated another. Apologies if I'm right and I could have just made a feature request, but I don't know how. Are feature requests bugs in KDE land? Ideally someone with a better idea of how this works could do it, but if no one does, I'll have a go; it's pretty much essential to the usability of my desktop.
After reading the whole thread, I think the problem is that different media types require different settings. There could be a general SDR brightness slider and a general HDR one (just like in 6.2). Additionally, separate HDR sliders could be available for different media formats (for example "video", "game", "office"), with the correct media type chosen based on the .desktop file of the application.
I don't think we need some complex or perfect solution - things like this are why I made an account here, so I can voice my opinion as a user - what we (me and many others, as seen above) want is a way to adjust SDR brightness and a way to adjust HDR brightness - nothing more or less than that. Perfection is not required, technically "correct" is not required; close enough and easy to use is what we want - no one wants to do mathematical equations or constant adjusting to get things as perfect as possible when we could just be adjusting a few EASY TO UNDERSTAND values that don't require advanced knowledge, or knowledge limited to the confines of technical threads, to understand. To be absolutely plain and honest - linking SDR brightness to HDR brightness reflects an extremely narrow view. I ask the question "Who wants to be flashed with SDR content so bright it gives you a migraine just to get the correct brightness for HDR content?", and the answer is almost nobody. Another question is "What kind of user would understand that something labeled Maximum SDR Brightness also changes the brightness of HDR content?"; the answer is almost exclusively the people who have seen this thread, and I cannot make it more clear that this does not make sense to any end user.
To add on to my last post, this is not an angry rant or some attack, I appreciate the work that has been put into every aspect of my operating system every second that I use it, more reason to want things easier for everyone to use.
I also think it is a very strange decision. It can't be that every time I want to watch a movie, watch YouTube or play a game, I have to adjust the display brightness to 203 nits to get proper HDR. If you really want this to be the way, then at the least it should _always_ be tone/brightness mapped to peak brightness. So the overall content can be at lower or higher brightness, but the peak brightness of the HDR display should always be used. Personally (and this goes for every person I have spoken with), I always want HDR content to use the full and proper range. If someone really wants to decrease HDR brightness, they will typically do it in their TV/monitor settings, so I don't understand why that even needs to be handled in KDE at all.
I was struggling with the new HDR sliders a lot, and after not finding a setting as good as 6.2, I ended up here. I'll summarize things to not take up much of anyone's time (reasons available below for those who want them). Consider this a collection of good practices (largely based on user feedback from Reddit and this issue). 1. sRGB color intensity never worked on NVIDIA hardware. The feature is broken for NVIDIA users. 2. Max SDR brightness should not control HDR brightness. This causes double tone mapping, which is undesirable. Just leave that to the screen settings (as mentioned before, these screens already do the work to make HDR look good within their limitations). 3. If 2 must be included anyway, then include an option to ignore it and leave HDR as-is (also known as HGiG mode). 4. If 2 exists to control the future KDE UI, then this must be a separate option that controls only the UI and nothing else. 5. Brightness is a useless slider as it is now. If the intent is to directly control the native screen backlight, then it must be set to do nothing when it cannot change that. Again, just leave that to the internal screen settings to avoid unnecessary double tone mapping. Reason for 2: I was going crazy because suddenly FF7 Remake had very different HDR than the PS5 or Windows do (it seems to be back to the 6.2 state after setting the slider to 203 nits). In a nutshell, multimedia content is already calibrated to work well with 1000 nits (or cd/m², candelas per square meter; I'll say nits because that's the screen industry standard by now). Common sense says messing with that is not a good idea. The future KDE calibration screen can and should be compliant with HGiG and with how games query Windows 11 for the limits of the screen in use. These limits should start with the EDID but ultimately be the result of this calibration process. Reason for 3: I have an OLED C9 from LG, quite an old OLED panel by now. It suffers from bad SDR processing. I have a photosensitivity condition thanks to astigmatism. Still talking about multimedia content: when KDE 6.2 was around, I discovered I could endure countless hours of SDR content just by setting it to 125 or 150 nits. 203 is too bright for SDR content and gives me headaches. KDE 6.3 deprived me of that option and just made things unnecessarily bureaucratic. Consider bringing this option back as an accessibility option that will help people with my problem. Astigmatism is a very common visual problem, which justifies its existence. Reason for 4: This is a productivity problem. I think it's a consensus that no one wants to look at a bright screen while working, unless the work itself has to do with multimedia production. That's why the UI brightness must be a separate setting. Less than 1% of users will actually use things the way they are in 6.3 and beyond, as the majority will be looking at code, paperwork or spreadsheets. Reason for 5: This is a nice setting to have if it properly controls the native backlight. When it can't, it will cause havoc, because the screen always remembers the last backlight level while KDE adjusts the output brightness even lower, compounding the problem. As an example, this setting is not able to control the OLED Light option on my LG C9, making it completely useless and potentially harmful there. It should be disabled if proper control of the native backlight cannot be attained, and in that situation it should not change the image output at all, leaving actual backlight control to the internal screen settings only.
This somewhat reminds me of the Super + P problem: it's not configurable and just works with 2 screens; if one connects a third, it does nothing, leaving the feature half-broken. The must-fix section: 1. The SDR max brightness slider is misleading, as it shifts the entire range up and down for both SDR and HDR, causing double tone mapping problems. It also must not control HDR in any way. Even if this is the intended design, it's clearly the wrong direction to follow based on the feedback provided so far. Because of that, I'm reopening the issue. Action must be taken to either remove HDR control from it or at least change the name and description of what it is doing. 2. SDR and HDR must be separate settings, as this is tied to each person's preference and/or visual issues. (Second reason I'm reopening the issue.) 3. An exclusive UI brightness control should be available just to control the native KDE UI, for productivity reasons. (Third reason I'm reopening this issue.) Last, but not least, no reason I described is nonsense; they represent real-world usage on other platforms (as another user said here: KDE doesn't need to create another standard so that we have to deal with two standards; let's just keep what works and improve upon that). It has nothing to do with a lack of HDR standards knowledge; I chose to keep the language simple to include as many people as possible, and more technical people can make the link. KDE doesn't need to reinvent the wheel and should just implement what was already working (like it was in 6.2; that version just needed to incorporate new options without losing the existing ones). I hope this is useful for the devs to know what users expect and want from HDR in KDE.
To be frank, I've explained this multiple times already, and I just don't have the time to repeat myself for every person that shows up and brings up the exact same complaint with the exact same misconceptions about brightness and HDR, and how "most" people supposedly use screens. Please read the previous comments before posting.
(In reply to Zamundaaa from comment #30) > To be frank, I've explained this multiple times already, and I just don't > have the time to repeat myself for every person that shows up and brings up > the exact same complaint with the exact same misconceptions about brightness > and HDR, and how "most" people supposedly use screens. > Please read the previous comments before posting. I read the entire conversation. The conclusion is that KDE removed a "bug" that everyone thought was easy to control and relied on, instead of making it a feature. I once worked at a place that had a defect in the product that a client relied on, so the defect got promoted to a feature. It's the same here. People are asking you to keep the bug around because it's now a feature. Letting people change SDR only is a bug and not predicted by the specs? At this point I would call it a feature, even if it's outside the spec. Let people use it. This is what people are asking for.
(In reply to Zamundaaa from comment #30) > To be frank, I've explained this multiple times already, and I just don't > have the time to repeat myself for every person that shows up and brings up > the exact same complaint with the exact same misconceptions about brightness > and HDR, and how "most" people supposedly use screens. > Please read the previous comments before posting. "To be frank" that's absolute nonsense. I asked a question in my last comment to help clear up my "misconceptions" and you ignored it. You've barely explained anything. I only know from reading your reddit comments that you actually plan on getting rid of this slider entirely and adding a separate HDR calibration page. Why don't you communicate that information to the group of people who subscribed to specifically get more information about that? You've intentionally left everyone in the dark and then you're confused when they have "misconceptions."
(In reply to tamodolo from comment #31) > Letting people change SDR only is a bug and not predicted by the specs? No, it was how the initial naive implementation happened to work. As I wrote before, it was intentionally changed to make lots of better things possible, including, but far from limited to, actual Linux native applications being able to do HDR without making their app unusable. > At this point I would call it a feature, even if it's outside the spec. Let people use it. This is what people are asking for. People here are asking for lots of different things, most of which are based on misconceptions about what things mean and how they work. Claims like HDR only being "proper" or "correct" if presented at the nit values the content specifies are just plain nonsense. To be clear, this is a complicated topic, the specs are terrible and confusing even for professionals, and it doesn't help that marketing companies did their very best to make HDR as much of a nonsense term as possible. So I don't blame anyone for not knowing everything about it, as long as they keep their claims about what's supposed to be "correct" minimal. But just like you're not thrilled by reading through tons of media standard documents or the literal years of development history of Wayland color management, I'm not thrilled spending a lot of time each day taking apart misconceptions about HDR, one by one, for each person commenting on this bug report. I would rather spend that time improving support for more efficient HDR video, the HDR calibration page, HDR screenshots and recording, and so much more... and Bugzilla isn't even a good place for discussions in general. So if you want to continue such a discussion, please read https://zamundaaa.github.io/colormanagement/2025/03/31/about-brightness.html so that we hopefully have some shared baseline of understanding, and then start a thread on https://discuss.kde.org instead (feel free to ping me there). Just like here, I can't promise that I'll answer quickly or anything, but it is at least a more suitable place for discussions, and a place where I'm much less likely to miss something, because it has an actual notification system and unread markers. And as mentioned before, if you specifically would like a feature to change the brightness of specific applications or content types (as far as that is possible to detect) rather than some arbitrary distinction between "SDR" and "HDR", that is entirely reasonable and you could make a new bug report about that. (In reply to TheFeelTrain from comment #32) > "To be frank" that's absolute nonsense. I asked a question in my last > comment to help clear up my "misconceptions" and you ignored it. You've > barely explained anything. I'm sorry, I get a lot of emails every day from this bug tracker alone, and sometimes one goes under. I assume you're referring to > Can you explain what this setting is *actually* doing then? You are legitimately the only person on planet Earth who knows how Plasma's HDR functions. All we have to go off of is what you've said here. Again, that's one of my biggest issues with this change. It's confusing. ? The two brightness sliders decide the reference luminance. Specifically, it's calculated as "5 + (Maximum SDR Brightness - 5) * brightness slider value". When the reference luminance is below 203cd/m², HDR content is darker compared to Plasma 6.2 or Windows. When it's above, it's brighter.
In both cases, content may go as bright or dark as the display can do; there are (currently at least) no restrictions or attempts to fudge them in either direction. Videos are played as they're meant to be, so that dark scenes are dark, and bright ones are bright, relative to the brightness you're adapted to. Depending on the video, that might use the entire brightness range of the display, or it might not. That's not a problem; just like the Batman movies rarely use the full dynamic range of even an SDR screen, the movie gets shown as intended. If you'd like to lighten it up or make it darker anyway, make that feature request about changing the brightness levels of an individual window... or check if your video player has built-in options for that. Games are a bit more messy, because all the current HDR games are built for Windows. They require different calibration values than on Windows... but you can effectively always configure them to go as bright (or not) as you like. Linux native games, because of how we do HDR, will "just work" using the display settings, without you needing to do anything. > I only know from reading your reddit comments that you actually plan on > getting rid of this slider entirely and adding a separate HDR calibration > page. Why don't you communicate that information to the group of people who > subscribed to specifically get more information about that? You've > intentionally left everyone in the dark and then you're confused when they > have "misconceptions." It will hopefully be a lot less confusing, but the slider in the calibration page will do the exact same thing as the old one - configure what "100% brightness" means, no matter if SDR or HDR. If you're unhappy with the current state of things, you won't be any more happy with that calibration page.
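[Editor's note: for readers who want the arithmetic from the comment above in one place, here is a minimal sketch of the quoted reference-luminance formula; variable names are illustrative, not KWin's actual identifiers.]

#include <cstdio>

// Sketch of the calculation quoted above:
// reference = 5 + (Maximum SDR Brightness - 5) * brightness slider value.
// HDR content then scales relative to the 203 cd/m2 BT.2408 reference.
int main()
{
    const double maxSdrBrightness = 500.0; // nits, "Maximum SDR Brightness"
    const double sliderValue = 1.0;        // brightness slider, 0.0 to 1.0
    const double reference = 5.0 + (maxSdrBrightness - 5.0) * sliderValue;
    std::printf("reference: %.0f nits, HDR at %.2fx of its 6.2 level\n",
                reference, reference / 203.0); // 500 nits, 2.46x
}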
I understand all the points and will dedicate time to reading more about it. (In reply to Zamundaaa from comment #33) > It will hopefully be a lot less confusing, but the slider in the calibration > page will do the exact same thing as the old one - configure what "100% > brightness" means, no matter if SDR or HDR. If you're unhappy with the > current state of things, you won't be any more happy with that calibration > page. To be more specific, I really would appreciate a slider dedicated only to content using BT.601 and BT.709. I don't know how practical detecting this is, but that's the feature I miss the most.
(In reply to Zamundaaa from comment #33) > In both cases, content may go as bright or dark as the display can do This is important information that could have been communicated better. As long as the max brightness can still be reached, I don't see a big problem with this solution. I would be interested in how the mapping / luminance curve adjustment is done, if you could point me to the relevant code and it isn't too much hassle.
Let me start off by saying that I am by no means a developer or a color expert. What I do know is that I had been using HDR quite happily on Plasma for quite a few months on my OLED TV. I had not updated my KDE in a while just due to lack of time; the computer would only come on for me to play some games with friends and then go right back off for another week or more. I finally decided to do updates today, and now my HDR, which was working perfectly and looked absolutely amazing, suddenly makes my entire desktop look like the saturation has been turned down and everything is way too bright and washed out. No matter what I do to the sliders I cannot make it look good again. I can make it look less bad, but no combination of the sliders has been able to return me to the absolutely gorgeous and, at least according to my cheap color calibrator, quite accurate colors that I had.

I won't pretend to have read every single comment in this thread, though I did attempt to skim most of it. I would just like to remind the developers not to let perfect be the enemy of good. Even if it's not technically correct, as long as it looks correct, that's what the end user is going to care about, and they aren't going to want to do a ton of manual tweaking to get there. My HDR was working; now it's useless, and I need to manually toggle it whenever I want to play an HDR game or watch some HDR content, because the desktop itself is unusable with HDR enabled now. If that's because this is a transition, then maybe give us back the old incorrect method until you're ready to fully transition. As an end user, especially one with little time for entertainment these days, it's very upsetting to have something that was working and bringing me joy suddenly break, and the only information I can seemingly find about it is people arguing about how color should be calibrated, and luminance, and this and that crap I honestly couldn't care less about.

If you got this far, thank you for reading my rant; hopefully it wasn't entirely useless.
(In reply to Kitsuna from comment #36)
> I finally decided to do updates today, and now my HDR, which was working perfectly and looked absolutely amazing, suddenly makes my entire desktop look like the saturation has been turned down and everything is way too bright and washed out. No matter what I do to the sliders I cannot make it look good again.
> [...]
Set the brightness to 100% and the SDR brightness to 203 nits. That is how things were before, as the dev said. But I agree: an SDR slider that controls HDR brightness doesn't make any sense outside the head of the person who made it this way, I think.
(In reply to evea from comment #35)
> I would be interested in how the mapping / luminance curve adjustment is done, if you could point me to the relevant code and it isn't too much hassle.
For content that ends up within the capabilities of the screen, it's here: https://invent.kde.org/plasma/kwin/-/blob/c188f5da7011a0380f846423c81686d3d9cbbe18/src/core/colorspace.cpp#L575
For when tone mapping is required, this algorithm is used: https://invent.kde.org/plasma/kwin/-/blob/c188f5da7011a0380f846423c81686d3d9cbbe18/src/core/colorpipeline.cpp#L273
Don't necessarily take it as a reference for anything though; it's not the best tone mapping curve, just what I got after a day or two of experimentation.

(In reply to Kitsuna from comment #36)
> I finally decided to do updates today, and now my HDR, which was working perfectly and looked absolutely amazing, suddenly makes my entire desktop look like the saturation has been turned down and everything is way too bright and washed out.
> [...]
That sounds in no way related to this bug report. Please create a separate bug report about your issue, with measurements (follow the verification section in https://zamundaaa.github.io/wayland/2024/07/16/how-to-profile.html while HDR mode is enabled vs. disabled to get some hopefully useful data).
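As a side note for readers who want an intuition for what a tone mapping step does: this is not the curve linked above, just a common textbook operator (extended Reinhard) that compresses content brightness exceeding the display's peak while leaving darker values mostly untouched. A minimal sketch, assuming luminance is normalized so that 1.0 is the display's peak:

```cpp
#include <iostream>

// Extended Reinhard tone mapping: compresses luminance so that
// maxContentLum maps exactly to 1.0 (the normalized display peak),
// while values well below the peak are only slightly reduced.
double extendedReinhard(double lum, double maxContentLum)
{
    return lum * (1.0 + lum / (maxContentLum * maxContentLum)) / (1.0 + lum);
}

int main()
{
    const double maxContent = 4.0; // content peaks at 4x the display's peak
    for (double l : {0.1, 0.5, 1.0, 2.0, 4.0}) {
        std::cout << l << " -> " << extendedReinhard(l, maxContent) << "\n";
    }
}
```

Treat this purely as an illustration of the concept; the real KWin code linked above operates inside its color pipeline and uses a different curve.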
Couldn't this just be fixed by implementing a whitelist maintained by the user?
- Say the user adds Cyberpunk 2077 to the list
- When said game is running and in focus, the system switches the brightness to 203 nits
- When the game stops running or is out of focus, the system switches back to whatever the user set with the brightness slider
Are there any technical reasons why this can't be done? A rough sketch of the idea follows below.
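Hypothetically, the proposed logic could look something like this minimal C++ sketch. Every name here is made up for illustration; a real implementation would need compositor support for tracking the focused window and would not live in a standalone program.

```cpp
#include <iostream>
#include <set>
#include <string>

// Hypothetical user-maintained whitelist of apps that should get the
// 203 cd/m² HDR reference regardless of the brightness slider.
const std::set<std::string> hdrWhitelist = {"Cyberpunk2077.exe"};

// Pick the reference luminance for the currently focused window:
// whitelisted apps get 203 cd/m², everything else follows the user's setting.
double referenceLuminanceFor(const std::string &focusedApp, double userSetting)
{
    return hdrWhitelist.count(focusedApp) ? 203.0 : userSetting;
}

int main()
{
    std::cout << referenceLuminanceFor("Cyberpunk2077.exe", 120.0) << "\n"; // 203
    std::cout << referenceLuminanceFor("firefox", 120.0) << "\n";           // 120
}
```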
Windows games are a different story, and yes, at least to some degree we can add workarounds for them. Like I said before, that belongs in a separate bug report though.
I'll update this issue with some new information about what I'm experiencing:

SDR content on HDR is very good. The conversion is great, and actually better than on a native SDR display, where the TV's processing messes up the colors.

Native HDR was working right until recently, but something I can't trace back went wrong along the way. As an example, FF7 Remake's HDR produces different results compared to Windows. This may be related to how screen calibration works, as it indicates that my 800-nit OLED is capable of only 530 nits. Moving the maximum setting to the actual 800 nits breaks HDR, forcing something similar to a bloom effect. Paper white is fixed at 203 nits.

I think a passthrough option should exist to avoid double calibration in cases where the app doesn't rely on KDE's calibration. Or at least a global turn-off option that still provides data for apps that rely on KDE's calibration instead of processing everything.

This was tested on KDE 6.4.3. Any suggestion on how I can get this right is welcome. Also, HDR worked fine before 6.3.