

Literally no other desktop has this functionality…
Yeah, afaik macOS can do it too.
An ICC profile is built into the memory of my laptop screen?
Yeah. Surprisingly, nearly every display comes with somewhat accurate color information in its EDID.
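For the curious, that EDID color information is easy to decode yourself. A rough sketch in pure Python (layout per the EDID 1.x base block, bytes 25..34; the helper name is made up):

```python
def decode_chromaticity(edid: bytes) -> dict:
    """Decode the CIE 1931 xy chromaticity coordinates stored at bytes
    25..34 of an EDID base block (10-bit fixed point, value / 1024).
    Bytes 27..34 hold the high 8 bits; bytes 25/26 pack the low 2 bits."""
    lo_rg, lo_bw = edid[25], edid[26]

    def coord(hi: int, lo2: int) -> float:
        return ((hi << 2) | lo2) / 1024.0

    return {
        "red":   (coord(edid[27], (lo_rg >> 6) & 3), coord(edid[28], (lo_rg >> 4) & 3)),
        "green": (coord(edid[29], (lo_rg >> 2) & 3), coord(edid[30], lo_rg & 3)),
        "blue":  (coord(edid[31], (lo_bw >> 6) & 3), coord(edid[32], (lo_bw >> 4) & 3)),
        "white": (coord(edid[33], (lo_bw >> 2) & 3), coord(edid[34], lo_bw & 3)),
    }
```

From these four xy pairs you can build the display's primaries matrix, which is all you need for a basic "built-in" color profile.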
Even Windows can’t do it
Afaik they did actually implement something in this direction in Windows 11, but it’s not exposed in a user-friendly way yet.
There is no saturation slider, though. I’ve seen it on some screenshots.
It’s currently always shown if the built-in color profile, HDR, or an ICC profile is used. IIRC it wasn’t visible with the color profile in some older versions of Plasma though, maybe that’s why you don’t see it.


Afaik YouTube is doing that, not PulseAudio, and there’s nothing that can be done about it.
what about color managed apps?
Colord isn’t running and no ICC profile is set on Xwayland, so they should assume sRGB as the target and be fine.
Do note that nothing is applied “to the full screen”, color management is done on each individual surface during compositing. If an app uses the color management protocol to use some colorspace, it gets taken into account.
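As a toy model of that (illustrative names only, not KWin's actual API), per-surface color management during compositing looks roughly like this:

```python
# Toy model of per-surface color management at composite time.
# Each surface carries its own transform (content space -> output space,
# here a 3x3 matrix in linear light); nothing is applied "to the full
# screen" afterwards.

def mat_vec(m, v):
    """Multiply a 3x3 matrix by an RGB triple."""
    return tuple(sum(m[r][c] * v[c] for c in range(3)) for r in range(3))

class Surface:
    def __init__(self, pixels, to_output):
        self.pixels = pixels        # linear-light RGB triples
        self.to_output = to_output  # chosen per surface, e.g. from the
                                    # color management protocol or sRGB

def composite(surfaces):
    frame = []
    for s in surfaces:
        frame.append([mat_vec(s.to_output, p) for p in s.pixels])
    return frame
```

A surface that uses the color management protocol just gets a different `to_output` transform than one assumed to be sRGB.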
Are you sure you can affect saturation system-wide with color profiles?
Yes, I wrote the relevant code in KWin.
Because in Gnome it’s not possible.
That’s because Gnome’s color management support is still limited, they don’t apply the full ICC profile yet.
Sounds like you want color management, not just arbitrary saturation changes.
Install KDE Plasma, select the “built-in” color profile, and you’re done, no more oversaturated colors. If you want to test how it looks, just use a live boot.


Why would you run it in Proton? It’s a native game.


Wayland as a protocol was designed around CSDs, protocols for SSDs came years later
That’s not an argument for anything. The core protocol isn’t useful on its own, you always need extensions that came later to even create a window. As another example, Wayland as a protocol was designed around shared memory buffers, protocols for hardware acceleration came later. Doesn’t mean you’re supposed to leave that out.
Modern apps tend to prefer CSDs anyway since it provides more flexibility, very common on MacOS and Windows
That too is not an argument for not implementing what a ton of apps need.
macOS and Windows don’t do the same sort of CSD as Gnome FYI, it’s more of a hybrid approach, where parts of the decoration are rendered by the system and parts by the app.
It’s difficult to coordinate things between the client and compositor.
That too isn’t relevant, libdecor doesn’t coordinate shit either. And if you want to (which is being looked into), you can absolutely sync things with SSD too.
The actual and only reason Gnome doesn’t support SSD is that they think CSD is a “better architecture”.


DDC/CI brightness changes are very often animated by the display firmware, so doing it fast is rarely possible.
You can however disable DDC/CI in the display settings if you’d rather have software brightness.
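If you want to poke at this by hand, the ddcutil CLI speaks DDC/CI directly; brightness is VCP feature 0x10 (assuming you have i2c permissions and a DDC/CI-capable monitor):

```shell
# Read the current brightness (VCP feature 0x10) from the first display
ddcutil --display 1 getvcp 10

# Set brightness to 70%; on many monitors the firmware animates the
# change, so rapid successive writes won't look instant
ddcutil --display 1 setvcp 10 70
```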


Afaik you very much can not turn it off
I get way more spam on WhatsApp than on Matrix. Never been invited to a fake group chat on Matrix at least…


Quite good I’d say. I don’t have other high end laptops for comparison though


Is the framework 13 really worth my money for the repairability and upgradability in comparison?
Depends on what you upgrade for, and what you need in the first place.
If you upgrade mainly for more CPU and GPU power, in my opinion that’s a hard sell. The new mainboards from Framework are hella expensive!
If you need a dGPU in a small form factor laptop, Framework just doesn’t offer that. Same for touch or built-in tablet support.
If you’re ok with the built-in GPU and upgrade for better display, for better battery, and a better but perhaps not the absolute latest and best APU, yes, it’s worth it.
A year or so after I bought the FW13, they brought out a new 120Hz, higher-resolution display. The first display being 60Hz was my only big annoyance with it, having a 120Hz monitor for comparison… So I just bought the new display, and swapping it took literally 5 minutes.
Similar story with the hinges, I wanted ones with more resistance, so I just bought stronger ones for 25€ and easily replaced them.
If the battery gets worse, or they bring out a new one with decently improved capacity, I can similarly replace it in 5 minutes.
No glue, no 10 types of special screws, just the screwdriver that was shipped with the laptop, and basically zero risk of breaking anything when making modifications.
You’ll have to know yourself if these tradeoffs are worth it to you… but after my old HP Envy’s display broke and even finding the correct replacement part was a challenge, let alone replacing it, I’m quite happy with the FW13.


Then when a game is started it starts another Gamescope session which launches the game in a second XWayland session.
No, it doesn’t start another gamescope. It starts a second Xwayland in the same gamescope instance.


The screen brightness adapts automatically to the windows I focus on, which is a good idea.
That’s definitely not something Plasma is doing… Sounds like your monitor is dumb with “adaptive contrast” or just terribly implemented local dimming.


The screencast portal has been around for 7 years. How is it not enough? It is very much GPU-only (if the receiving program supports it, which nearly all do), and encoding the image is up to the app and does not depend on the API.
This sounds like a bug that was fixed some time ago: the desktop window steals focus when it gets created, so every time the display reconnects to the PC.
Because you’re on Debian with Plasma 5.27.5, you don’t have that fix though.


I don’t actually believe this to be the case; if it was, people who use custom ICCs would get extremely wonky results, which don’t typically happen.
They wouldn’t, because applying ICC profiles is opt-in for each application. Games and at least many video players don’t apply ICC profiles, so they do not see negative side effects of it being handled wrong (unless they calibrate the VCGT to follow the piece-wise TF).
With Windows Advanced Color of course, that may change.
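To make the split concrete: the VCGT is a per-channel calibration curve applied to every pixel on screen (traditionally via the video card's gamma ramp), while the rest of the ICC transform is opt-in per app. A rough sketch of the VCGT part (illustrative, not actual compositor code):

```python
def apply_vcgt_channel(value: float, lut: list) -> float:
    """Apply one channel of a VCGT-style calibration curve.
    value is in 0..1, lut is a list of 0..1 samples (the gamma ramp);
    intermediate values are linearly interpolated."""
    pos = value * (len(lut) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1 - frac) + lut[hi] * frac

# This gets applied globally, so apps that don't do any color management
# (games, many video players) still see its effect, while the full ICC
# transform only happens in apps that opt in.
identity_ramp = [i / 255 for i in range(256)]
```

That's also why a VCGT calibrated against the piece-wise transfer function would visibly affect even non-color-managed content.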
I think I am a bit confused on the laptop analogy then, could you elaborate on it?
What analogy?
How monitors typically handle this is beyond me, I will admit, but I have seen some really bonkers ways of handling it, so I couldn’t really comment on whether or not this holds true one way or another. Just so I am not misinterpreting you, are you saying that “if you feed 300 nits of PQ, the monitor will not allow it to go above its 300 nits”? If so, this is not what happens on my TV unless I am in “creator/PC” mode. In other modes it will allow it to go brighter or dimmer.
Yes, that’s exactly what happens. TVs do random nonsense to make the image look “better”, and one of those image optimizations is to boost brightness. In this case it’s far from always nonsense of course (on my TV it was though, it made the normal desktop waaay too bright).
unless I am in “creator/PC” mode
Almost certainly just trying to copy what monitors do.
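For reference, PQ (SMPTE ST 2084) encodes absolute luminance, which is why a display in a reference mode can track the signal exactly. A quick sketch of the EOTF:

```python
# SMPTE ST 2084 (PQ) EOTF: normalized signal (0..1) -> absolute cd/m²
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(signal: float) -> float:
    """Decode a normalized PQ signal value to luminance in cd/m² (nits)."""
    ep = signal ** (1 / M2)
    y = max(ep - C1, 0.0) / (C2 - C3 * ep)
    return 10000.0 * y ** (1 / M1)
```

A display in creator/PC mode is expected to show `pq_to_nits(signal)` directly, up to its capabilities; the “smart” TV modes rescale it instead.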
With libjxl it doesn’t really default to the “SDR white == 203” reference from the “reference white == SDR white” common… choice? not sure how to word it… Anyways, libjxl defaults to “SDR white = 255” or something along those lines, I can’t quite remember. The reasoning for this was simple, that was what they were tuning butteraugli on.
Heh, when it came to merging the Wayland protocol and we needed implementations for all the features, I was searching for a video or image standard that did exactly that. The protocol has a feature where you can specify a non-default reference luminance to handle these cases.
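To illustrate what such a non-default reference luminance changes (203 cd/m² is the common BT.2408 choice; the function names here are made up):

```python
def srgb_eotf(c: float) -> float:
    """sRGB electro-optical transfer function, signal 0..1 -> linear 0..1."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def sdr_to_nits(c: float, reference_white: float = 203.0) -> float:
    """Map an SDR signal value to absolute luminance. A source that was
    tuned against a different SDR white (as described for libjxl above)
    would just supply a different reference_white here."""
    return srgb_eotf(c) * reference_white
```

With the protocol feature mentioned above, the content can carry that number instead of the compositor having to guess it.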
It is indeed the case that users won’t know what transfer function content is using, but they absolutely do see a difference other than “HDR gets brighter than SDR”, and that is “it’s more smooth in the dark areas”, because that is also equally true.
That is technically speaking true, but no one actually sees that. People do often get confused about bit depth vs. HDR, but that’s more to do with marketing conflating the two than people actually noticing a lack of banding with HDR content. With the terrible bitrates videos often use nowadays, you can even get banding in HDR videos too :/
When you play an HDR and an SDR video on a desktop OS side by side, the only normally visible differences are that the HDR video sometimes gets a lot brighter than the SDR one, and that (with a color managed video player…) the colors may be more intense.
There’s no ICC profile for it, it’s just read from the EDID