Intet: Apple Silicon products were using temporal dithering (M1/M2/M3/M4), but for a while you were able to use Stillcolor or BetterDisplay to disable it.
They all apply GPU dithering, in every chip generation, but Stillcolor is confirmed via Terminal to disable it. M3 was the latest chip released when Stillcolor came out, so most of the capture-card confirmation is from M1–M3 machines. To my knowledge, no one has tested an M4 or M5 Mac via the capture-card method outlined in the Stillcolor thread.
I should have been clearer: the assumption that Stillcolor isn’t working on M4 and M5 exists only because it doesn’t seem to alleviate symptoms. I think it’s likely Stillcolor still disables GPU dithering, but either the internal panels themselves differ or additional processing is being applied. Again, no one has tested this; we are going by symptoms alone.
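For reference, the Terminal confirmation mentioned above amounts to checking whether the framebuffer reports a dither flag in the I/O Registry. This is a sketch, not a definitive procedure: the property name (`enableDither`) comes from the Stillcolor thread and is not documented by Apple, so treat it as an assumption, and the check only does anything on a Mac.

```shell
#!/bin/sh
# Hedged sketch: look for the framebuffer's dither flag on an Apple Silicon
# Mac. "enableDither" is the property the Stillcolor thread reports toggling;
# the exact name is an assumption, not a documented Apple API.
if command -v ioreg >/dev/null 2>&1; then
    # -l dumps all registry properties; -w0 disables line wrapping so grep
    # sees whole property lines.
    ioreg -lw0 | grep -i "dither" || echo "no dither property reported"
else
    echo "ioreg not found (not macOS); skipping check"
fi
```

On a Mac where Stillcolor is active, the expectation (per the thread) is that the flag shows as disabled; on anything other than macOS the script just skips.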
I theorized that the gray-color flicker was behind the symptoms because it is worse on M4 MacBook Airs, which, lacking the aggressive PWM of the Pros, were tested by more users here.
Intet: But since then, compatibility was broken by subsequent macOS updates, while it still works on Intel Macs. This info might be wrong; I am just telling you what my information is to date.
Yeah, we don’t know whether Sequoia and Tahoe stop Stillcolor from working; no one has tested this. Stillcolor was originally tested on Ventura and Sonoma.
Intet: The monitor can run in native color mode (8-bit), and I had it successfully doing so on Windows and Linux. But no matter what I did on my Mac Studio, it would always force 10-bit (the maximum possible) output to the display.
So even with BetterDisplay disabling GPU dithering and selecting 8-bit output, FRC was still enabled by your monitor on Mac only?
Intet: But with my new monitor, which only supports native 8-bit, my Mac Studio outputs 8-bit to this display. The Mac Studio does not inject its own temporal dither into the display output via its GPU; it just automatically asks for temporal dither if the attached screen reports that it has that ability.
I’m not completely following you. By temporal dither, you’re referring to monitor FRC, correct? And I’m assuming you’re still disabling GPU dithering and sending an 8-bit signal via BetterDisplay?
Intet: But I assume from your writings that, more likely, the screens in the M-line MacBooks all support temporal dither.
Every screen in the MacBooks and the Studio Display is 8-bit+FRC. I’m not sure what the M1 MBA and the M1/M2 13” MBP are using, because they have the older Retina screens and are marketed as capable of “millions of colors” (which implies 8-bit, since 2^24 is roughly 16.7 million) while also supporting P3 wide color.
This is where I’m trying to draw a conclusion, based on older Intel Macs with the same specs, as to whether they’re achieving P3 wide color using solely GPU dithering, or whether the internal panel is also using FRC.
Intet: There is no “pseudo 8-bit” to my knowledge; if 8-bit were not supported at all, the image could not be displayed. 8-bit and 10-bit color outputs are different formats which are not backwards compatible. Relevant here but beside the main point: theoretically, some screens upscale all 8-bit signals to 10-bit, often via FRC.
This is what I’m basically trying to understand: does BetterDisplay actually deliver an 8-bit signal when selected, or does macOS always output 10-bit, such that a monitor that is only 8-bit capable simply receives a signal cut down to 8-bit, as long as Apple Silicon GPU dithering is disabled?
I’m assuming BetterDisplay only gives you an 8-bit output option because your external display is a true 8-bit panel? But if it offers both 10-bit and 8-bit output options, you could test whether leaving it on 10-bit makes a difference.
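One way to see which bit depth macOS actually negotiated (rather than what was requested in BetterDisplay) is to read the display section of `system_profiler`. This is a sketch under assumptions: the depth label and its values (e.g. ARGB8888 for 8 bits per channel vs. ARGB2101010 for 10) vary by macOS version, so the grep pattern may need adjusting.

```shell
#!/bin/sh
# Hedged sketch: report the pixel/framebuffer depth macOS negotiated for
# each attached display. On many macOS versions the relevant line mentions
# "Depth" with a value like ARGB8888 (8 bpc) or ARGB2101010 (10 bpc);
# exact wording is version-dependent.
if command -v system_profiler >/dev/null 2>&1; then
    system_profiler SPDisplaysDataType | grep -i "depth" \
        || echo "no depth line reported"
else
    echo "system_profiler not found (not macOS); skipping check"
fi
```

Running this before and after switching between 8-bit and 10-bit in BetterDisplay would show whether the selection actually changes what the OS reports, which bears directly on the question above.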
Sorry for all the questions, but I think the answers are relevant to the protocol we should direct users to follow to get a clean signal from a Mac.
It would also confirm (or at least heavily imply) why a user who posted here a few months ago had a clean signal from an M1 MacBook Air, but got symptoms from an M1 Mac Mini on the same version of macOS, with the same cable, the same monitor, and the same BetterDisplay output options.
The reason seems to be that the older Retina “millions of colors” laptops actually have an 8-bit output option, whereas the Mac Mini only has 10-bit. That would be an example of a difference in the display pipeline, likely carried over in the TCON component and firmware from the old Intel Retina-era technology.
In theory, this would not mean you can’t achieve 8-bit output on the newer Macs, but rather that it will only work with a true 8-bit external display, and possibly only on specific chips and OS versions.