aiaf really good to know, does this mean it's recommended to leave VUCEnable at the system default, because trying to disable it would actually introduce more flicker?
I disabled dithering on Apple silicon + Introducing Stillcolor macOS M1/M2/M3
NewDwarf so the way I understand DisplaysShouldNotBeTVs is that there's either another IC, exclusive to the built-in panels, which does dithering on the data received from the DCP; or another piece of code in the GPU/DCP pipeline that does dithering for the built-in panel regardless of what enableDither is set to; or some kind of color/brightness/voltage/temperature compensation/correction exclusive to the built-in panel that has dithering as a side effect. Time will tell.
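In the meantime, one thing that's easy to check is what the OS-visible side of the pipeline reports. Here's a minimal Swift sketch that dumps the dither-related keys from the IORegistry; the service class name ("IOMobileFramebufferShim") and the key names are assumptions based on what's been discussed in this thread, not a documented interface:

```swift
import Foundation
import IOKit

// Keys discussed in this thread (assumed, not documented by Apple).
let keys = ["enableDither", "uniformity2D", "VUCEnable"]

// Assumed service class; adjust if your IORegistry uses a different name.
let matching = IOServiceMatching("IOMobileFramebufferShim")
var iterator: io_iterator_t = 0

if IOServiceGetMatchingServices(kIOMainPortDefault, matching, &iterator) == KERN_SUCCESS {
    var service = IOIteratorNext(iterator)
    while service != 0 {
        for key in keys {
            // Read the property if the driver publishes it.
            if let value = IORegistryEntryCreateCFProperty(service,
                                                           key as CFString,
                                                           kCFAllocatorDefault,
                                                           0)?.takeRetainedValue() {
                print("\(key) = \(value)")
            }
        }
        IOObjectRelease(service)
        service = IOIteratorNext(iterator)
    }
    IOObjectRelease(iterator)
}
```

If enableDither reads back as false but the panel still shows dither-like noise, that would point toward the "extra IC / panel-side processing" theory rather than the DCP ignoring the flag.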
DisplaysShouldNotBeTVs for my single experiment, leaving it on is better. But I have to repeat this measurement multiple times to really verify. What do you notice in the pattern/tone of banding when VUC is disabled?
(Edit: I meant leaving it on. Leave VUCEnable = true as is.)
aiaf interesting, i'd actually be very surprised if it's a true 10-bit panel
do you suspect the M2 MacBook Air with "support for billions of colors" is also a true 10-bit panel?
given that in comparison, the M1 MacBook Air specs say the internal screen only supports "millions of colors", despite macOS still forcing desktop bit depth to 10 bpc on both Airs
on the other hand, it's possible the M2 Air uses FRC on the panel to achieve "10-bit", like some 10-bit-capable external monitors do, and the M1 Air simply chops off the last 2 bits of precision
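for reference, here's a small Swift sketch of what macOS itself reports for the built-in screen. this is only the depth/color space AppKit exposes, which may not match either the 10 bpc the desktop is driven at or the panel's native bit depth, so treat it as one more data point rather than proof:

```swift
import AppKit

// Print what AppKit reports for the main screen. This is the window/framebuffer
// depth the OS exposes, not a measurement of the panel's native bit depth.
if let screen = NSScreen.main {
    let depth = screen.depth
    print("bits per sample:", NSBitsPerSampleFromDepth(depth))
    print("bits per pixel: ", NSBitsPerPixelFromDepth(depth))
    print("color space:    ", screen.colorSpace?.localizedName ?? "unknown")
}
```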
aiaf the Asahi Linux project has made giant strides in reverse engineering the DCP interface, and has written clients/drivers for it https://github.com/AsahiLinux/linux/tree/asahi/drivers/gpu/drm/apple
I didn't notice anything related to dithering in this project.
aiaf i notice the banding gets more significant/less precise on some transparency effects and shadows, especially when Software Brightness is lower. (that's the BetterDisplay feature that controls brightness via the color profile/gamma table, for reference; see the sketch at the end of this post.)
there are also some very specific shades of gray where, with VUCEnable at its default (on), my brain feels like they are not totally solid and have some blurry reddish blotches. when VUCEnable is off this becomes obvious: the parts i suspected were reddish now show a clearly reddish, blocky, irregular pattern of larger squares filling those areas. you can find these shades by slowly moving the Software Brightness slider while looking at a solid gray background; some levels will cause this pattern to appear immediately.
(i say "more obvious" but you still have to look closely to see them, it's just that an actual pattern of squares that always remain at the same position is visible now)
i did notice, and do agree, that leaving VUCEnable at its default (on) actually made me feel the best; turning it off just increased the banding. disabling only dither and uniformity2D seemed to create the most comfortable screen so far (relatively speaking, still a lot worse than other laptops lol). i also felt like i noticed some strange flicker with VUCEnable off.
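for context on the Software Brightness point above: a gamma-table brightness control presumably does something like the sketch below (an assumption about how BetterDisplay's feature behaves, not its actual code). scaling each channel's maximum output down compresses the whole signal into fewer distinct steps while the backlight stays fixed, which would explain why banding gets worse at low software brightness:

```swift
import CoreGraphics

// Hypothetical "software brightness": scale the gamma formula's per-channel
// maximum so the display output range is compressed. Fewer distinct output
// steps remain, which is consistent with increased banding at low levels.
func setSoftwareBrightness(_ level: Float) {
    let display = CGMainDisplayID()
    let maxValue = max(0.05, min(1.0, level))   // clamp so the screen never goes fully black
    _ = CGSetDisplayTransferByFormula(display,
                                      0, maxValue, 1,   // red:   min, max, gamma
                                      0, maxValue, 1,   // green: min, max, gamma
                                      0, maxValue, 1)   // blue:  min, max, gamma
}

setSoftwareBrightness(0.6)   // roughly "60% software brightness"
// CGDisplayRestoreColorSyncSettings() (or quitting the process) undoes the change.
```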
NewDwarf their tracer does make note of it though https://github.com/AsahiLinux/m1n1/blob/3b9a71422e45209ef57c563e418f877bf54358be/proxyclient/hv/trace_dcp.py#L582
DisplaysShouldNotBeTVs yep, I agree, better to leave it on. We don't even know what VUC means.
https://ledstrain.org/d/1111-external-graphics-egfxegpu/8
From @Seagull
When turning dithering off you'll only see the difference on a laptop screen. Desktop monitors do their own dithering in addition to any graphics card dithering. Laptop screens are dumb, and rely on the graphics card to dither for them.
Emphasis mine.
But then this is Apple...
DisplaysShouldNotBeTVs I think the MBP is using a 10-bit display, without FRC, according to Apple's specs. I know the Apple XDR display is true 10-bit; that's also mentioned on their website. And I know their regular Apple Studio Display uses FRC, the same as the MBA.
no, it's a big difference
macOS scales down
Windows scales up
try disabling "retina" mode in macOS and tell me if the text isn't gentler and easier to read despite being super blurry
madmozg I think the MBP is using a 10-bit display, without FRC
On my MacBook Pro I do see a change with Stillcolor in that it seems to reveal quantization artifacts. Waves of magenta and cyan. Seems they have kept some level of dithering in place to further boost the display beyond 10 bits and manage those waves. It still isn’t clear if this is temporal or spatial dithering.
Thanks for those screenshots. It really seems to follow that “support for X colors” is Apple’s marketing for FRC dithering.
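One simple way to probe whether what you're seeing is plain quantization or dithering is to look at a synthetic ramp. Here's a hedged Swift sketch (file name and dimensions are arbitrary) that writes an 8-bit grayscale ramp to a PNG; viewed full screen, hard step edges suggest truncation, a fixed fine texture over the steps suggests spatial dithering, and a shimmering texture suggests temporal dithering:

```swift
import CoreGraphics
import ImageIO
import UniformTypeIdentifiers

// Build an 8-bit grayscale ramp (left = black, right = white).
let width = 1024, height = 256
var bytes = [UInt8](repeating: 0, count: width * height)
for y in 0..<height {
    for x in 0..<width {
        bytes[y * width + x] = UInt8((Double(x) / Double(width - 1)) * 255.0)
    }
}

// Wrap the pixels in a CGImage and write it out as ramp.png.
let data = CFDataCreate(nil, bytes, bytes.count)!
let image = CGImage(width: width, height: height,
                    bitsPerComponent: 8, bitsPerPixel: 8, bytesPerRow: width,
                    space: CGColorSpaceCreateDeviceGray(),
                    bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.none.rawValue),
                    provider: CGDataProvider(data: data)!, decode: nil,
                    shouldInterpolate: false, intent: .defaultIntent)!
let dest = CGImageDestinationCreateWithURL(URL(fileURLWithPath: "ramp.png") as CFURL,
                                           UTType.png.identifier as CFString, 1, nil)!
CGImageDestinationAddImage(dest, image, nil)
CGImageDestinationFinalize(dest)
```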
Blooey I also saw it on my MBP 14", don't know why. It looks like the software tried to send an 8-bit signal, but the Apple silicon GPU is still applying dithering or something like that.
"Support billion of colors" words used by some manufacturers like LG SAMSUNG ASUS and etc. You can check their websites and official specifications. In monitors manuals they will show you what type of display, 8bit, 8bit+FRC or true 10 bit, its easy, but I think not that easy with Apple. I wonder if this specification type forced to have by US regulators or so? And why apple don't have this kind of described documentation what kind of display etc used in their devices..
For example, ASUS's specifications for their monitors: