languidicity: Or is the working hypothesis that the dithering is done solely by the GPU and the panels are native 8-bit with no FRC? Which, again, is surely unlikely.
We know Apple has a GPU temporal dithering system available for external displays, so it isn't necessarily a stretch to say they would want to use the same system on the native displays. But proving that is still up in the air.
One theory: if we can prove Apple never sends more than 8-bit color to the display, dithered or not, then it doesn't matter whether the display has temporal dithering built in. An 8-bit signal should pass through to an 8-bit native panel untouched, even if its TCON is capable of dithering, because the TCON only needs FRC when the incoming signal has more depth than the panel. But even answering that question is taking time.
Muddying things for me is that I still can't say for certain whether my MacBook Pro display is native 8-bit or native 10-bit. Not knowing that for certain casts doubt on any experiment I come up with. Apple exposes 8-bit as my native display depth, but the private APIs seem to report a possible 10-bit mode. It turns out it's tricky to tell whether a display is native 10-bit or 8-bit + dithering when the specifications don't say.
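For anyone who wants to poke at this themselves, here is a minimal Swift sketch of the kind of probe I mean. It only uses public CoreGraphics calls; note that CGDisplayModeCopyPixelEncoding has been deprecated since macOS 10.11, so on modern systems the string it returns may not reflect what the hardware actually does. Treat this as a best-effort hint, not proof of the panel's native depth:

```swift
import CoreGraphics

// Best-effort probe: enumerate the main display's modes and dump their
// pixel-encoding strings. Deprecated API, so results are hints only.
let displayID = CGMainDisplayID()

if let modes = CGDisplayCopyAllDisplayModes(displayID, nil) as? [CGDisplayMode] {
    for mode in modes {
        let encoding = CGDisplayModeCopyPixelEncoding(mode) as String? ?? "unknown"
        // "--------RRRRRRRRGGGGGGGGBBBBBBBB" -> 8 bits per channel
        // "--RRRRRRRRRRGGGGGGGGGGBBBBBBBBBB" -> 10 bits per channel (kIO30BitDirectPixels)
        print("\(mode.width)x\(mode.height): \(encoding)")
    }
}
```

Even if this reports a 30-bit mode, that still doesn't distinguish a native 10-bit panel from an 8-bit panel fed a 10-bit signal and dithered down, which is exactly the ambiguity I'm stuck on.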
photon78s: Perhaps I have been recording the internal and external monitors' FRC with the scope, and not the dithering from the software side? That is another explanation.
The big problem for me and the microscope is the brutal LCD polarity-inversion bias, but I do think the microscope will have the answers.

I spent some time looking at the dithering pattern Apple is using. It is handy to know that they appear to have four different blue-noise patterns that they cycle through frame by frame, then repeat. There is some slight additional white noise layered on top of that, but the main effect when watching one pixel is that it flashes through four colors and then repeats that sequence of four colors indefinitely. I measured this by zooming into the input-capture videos: stepping frame by frame, you can clearly see the dithering pattern affecting the colors of individual pixels in the original input video.
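To make the observation concrete, here is a tiny model of that behavior. This is my interpretation of what I saw in the captures, not Apple's actual algorithm, and the offset values are placeholders (the real patterns would be blue-noise textures):

```swift
// Model of the observed behavior: four fixed noise patterns cycled frame by
// frame, so any single pixel flashes through at most four colors and repeats.
struct TemporalDither {
    // One offset (0 or 1 LSB) per pattern, sampled at this pixel's location.
    let patternOffsets: [UInt8] = [0, 1, 0, 0]  // placeholder values

    func output(base: UInt8, frame: Int) -> UInt8 {
        // Frame N uses pattern N mod 4; clamp so a base of 255 can't wrap.
        let offset = patternOffsets[frame % patternOffsets.count]
        return UInt8(min(255, Int(base) + Int(offset)))
    }
}

let d = TemporalDither()
// A pixel with base value 150 repeats 150, 151, 150, 150, 150, 151, ...
for frame in 0..<8 {
    print(d.output(base: 150, frame: frame), terminator: " ")
}
```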
Measuring the RGB values of the noise shows that each sub-pixel increases by at most one 8-bit step. It is interesting that the dithering occurs at 8-bit, considering @aiaf captured the noise at 10-bit.
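Actually, those two observations fit together neatly: a ±1-step 8-bit dither over a four-frame cycle is exactly the budget needed to emulate 10-bit, because 10-bit has four levels between each pair of 8-bit levels. A back-of-the-envelope sketch (my interpretation, values hypothetical):

```swift
// Emulating a 10-bit level with a four-frame, ±1-LSB 8-bit dither.
let target10Bit = 601                 // hypothetical 10-bit level
let base8Bit = target10Bit / 4        // 150 (the 8-bit level just below)
let remainder = target10Bit % 4       // 1 -> raise the pixel on 1 of 4 frames
let frames = (0..<4).map { $0 < remainder ? base8Bit + 1 : base8Bit }
let average = Double(frames.reduce(0, +)) / 4.0
print(frames, average * 4)            // [151, 150, 150, 150] 601.0
```

So an 8-bit signal flashing through four values could be the rendering of exactly the kind of 10-bit noise @aiaf captured.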
I also know that on my MacBook Pro the polarity-inversion pattern is a single-pixel checkerboard that simply alternates every frame. Importantly, this means all sub-pixels of the same pixel flicker up and down in parallel with each other, whereas with dithering each sub-pixel can go high or low independently.
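That gives a usable distinguishing test under the microscope. A sketch, assuming a 1x1 checkerboard (dot) inversion that flips every frame:

```swift
// Polarity inversion is a property of the whole pixel: its phase depends only
// on position and frame parity, so R, G, and B always move together.
func polarityPhase(x: Int, y: Int, frame: Int) -> Bool {
    (x + y + frame) % 2 == 0
}

// Dithering, by contrast, is per sub-pixel: each of R, G, and B follows its
// own four-frame offset sequence. So if the three sub-pixels of one pixel
// ever flicker out of step with each other, that flicker cannot be polarity
// inversion and must come from dithering (or something else entirely).
```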
I think, armed with this knowledge, the next time I get a chance to look under the microscope I will have a better chance of noticing the difference with Stillcolor. Just hopefully the TCON isn't going to layer even more noise on top of all this!