Interestingly, ChatGPT claims the following about how macOS decides what signal to send to an external monitor and whether it enables GPU-side dithering. This is apparently informed by what the EDID tells the host about the monitor's capabilities (see the EDID-parsing sketch after the list below).
6 bit + FRC - 8 bit signal sent from macOS. No GPU dithering; the panel uses its internal FRC to approximate 8 bit color.
8 bit (True 8 bit) - 8 bit signal sent from macOS. No GPU dithering, no FRC; the panel displays 8 bit color natively.
10 bit (8 bit + FRC) - 8 bit signal sent from macOS with temporal dithering done at the GPU level. The panel runs in 8 bit mode, and the GPU-side dithering simulates 10 bit color. The reason given is that macOS doesn't trust third-party monitors to do FRC consistently, so for consistency's sake it does the dithering on the GPU side, essentially bypassing the external monitor's own FRC implementation while still fulfilling the "10 bit capable" promise.
10 bit (True 10 bit) - 10 bit signal sent from macOS. No GPU dithering, no FRC; the panel displays 10 bit color natively.
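For context on the EDID part: a monitor's advertised color depth lives in byte 20 of the base EDID block (the "Video Input Parameters" bitmap defined in EDID 1.4). Below is a minimal sketch that decodes it from a raw EDID you've already extracted; on Intel Macs the blob shows up as IODisplayEDID in `ioreg -lw0` output, while on Apple Silicon it's harder to dig out. Note the field only states a bit depth per primary, so it says nothing about whether the panel reaches that depth natively or via its own FRC.

```python
import sys

# Minimal sketch: decode the advertised color bit depth from a raw EDID blob
# (byte 20, the "Video Input Parameters" bitmap, as defined in EDID 1.4).
# Pass the EDID as a hex string, e.g.:
#   python3 edid_depth.py 00ffffffffffff00...

def advertised_bit_depth(edid: bytes) -> str:
    if edid[19] < 4:
        return "EDID 1.3 or older: the standard bit-depth field is not defined"
    video_input = edid[20]
    if not video_input & 0x80:
        return "analog input (no digital bit-depth field)"
    depth_code = (video_input >> 4) & 0x07  # bits 6..4
    return {
        0b000: "undefined",
        0b001: "6 bits per primary",
        0b010: "8 bits per primary",
        0b011: "10 bits per primary",
        0b100: "12 bits per primary",
        0b101: "14 bits per primary",
        0b110: "16 bits per primary",
        0b111: "reserved",
    }[depth_code]

if __name__ == "__main__":
    edid = bytes.fromhex(sys.argv[1].replace(" ", ""))
    print("EDID advertises:", advertised_bit_depth(edid))
```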
With respect to its own built-in displays, Apple supposedly trusts its own FRC and TCON, so it sends a 10 bit signal to the built-in display and lets the panel use FRC via the TCON to simulate 10 bit color. This is in contrast to third-party external displays, where for consistency's sake the dithering is done on the GPU side and an 8 bit signal with temporal dithering is sent.
I asked for references and it vaguely pointed to code in the Darwin display drivers and to WWDC sessions and documents.
ChatGPT is known to be wrong about things at times, so I don't know how accurate this is. That said, "not trusting third-party FRC and doing it themselves on the GPU/OS side for consistency's sake" would be on brand for Apple, even if it's a bit counterintuitive.
Can anyone who’s actually tested these combinations comment on the accuracy of this?
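For anyone willing to poke at this, one crude starting point is to check what depth macOS itself claims to be driving each display at. This is only a sketch: it assumes your macOS version still exposes a depth field in system_profiler (field names vary by version and hardware), and it says nothing about GPU-side temporal dithering, which is the part that's actually hard to observe without capturing the signal itself.

```python
import subprocess

# Print resolution and any depth-related lines that macOS reports for
# attached displays. Field names ("Pixel Depth", "Framebuffer Depth", ...)
# vary by macOS version and hardware, and nothing here reveals GPU-side
# temporal dithering -- only the depth macOS thinks it is driving at.
out = subprocess.run(
    ["system_profiler", "SPDisplaysDataType"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.splitlines():
    if "Depth" in line or "Resolution" in line:
        print(line.strip())
```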