aiaf This is remarkable work! So much easier to decipher than Lagom.
👍 glad it's working!
aiaf What's interesting is that on my MBP M3 Max (P3-1600 nits preset at max HW and SW brightness), with Stillcolor on (enableDither = No), I can clearly see the 9-bit bands at twice the width of the 10-bit bands. So this confirms at least a "native" 10-bit image, likely dithered by the TCON. However, when I turn dithering back on, the banding in the 11-bit section becomes a lot smoother, and, zoomed in, the bands are half the width of those in the 10-bit section.
I think this gives some credence to the double-dithering approach, or DCP dithering -> TCON reverse temporal dithering, as I will clarify later. This would be the case if these MBPs do not use true 10-bit panels.
From what I understand, this would also happen if the panel is natively 10-bit. With enableDither=false, the display would receive a non-dithered, truncated or rounded 10-bit signal and would not dither it further. On the test ladder image, you would then see new color divisions up to a maximum of the 10-bit gradient level, natively displayed. Then, with enableDither=true, the display still receives a 10-bit signal, but one that has been color dithered by the GPU or CPU from the original 16-bit value. Because the GPU and CPU have access to the original 16 bits, the dithering pushes us higher up the gradient ladder: you now see the 11-bit level, simulated.
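To make that concrete, here's a minimal Python sketch assuming a probabilistic-rounding dither (the actual GPU/TCON algorithm isn't documented, and every name here is made up): plain truncation always lands on the same 10-bit code, while dithering recovers the in-between value on average.

```python
import random

def truncate_to_10bit(v16: int) -> int:
    """Plain truncation: drop the low 6 bits of a 16-bit code."""
    return v16 >> 6

def dither_to_10bit(v16: int) -> int:
    """Randomized rounding: round up with probability equal to the
    fractional remainder, so the long-run average equals v16 / 64."""
    base, frac = divmod(v16, 64)
    return base + (1 if random.random() < frac / 64 else 0)

# A source value halfway between two 10-bit codes: representing it
# exactly requires 11 bits.
v16 = (512 << 6) + 32

frames = [dither_to_10bit(v16) for _ in range(100_000)]
print(truncate_to_10bit(v16))      # always 512: the half-step is lost
print(sum(frames) / len(frames))   # ~512.5: the 11-bit level, recovered on average
```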
Theoretically, how much extra bit depth you can simulate increases with the screen area covered by a single color and with the display's refresh rate. And also, counter-intuitively, the slower the response time of the liquid crystal, the higher the dithered color depth: the inertia of the liquid crystal can average between two input levels. Apple has anecdotally been said to use that strategy.
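A rough sketch of that response-time argument, modeling the crystal as a first-order low-pass filter (a simplification I'm assuming; the real electro-optical response is more complex, and `tau_frames` is an invented parameter, the response time in frame periods):

```python
def lc_response(drive: list[float], tau_frames: float) -> list[float]:
    """First-order low-pass: the crystal relaxes toward the drive level
    each frame; larger tau_frames means a slower crystal."""
    alpha = 1.0 / (1.0 + tau_frames)
    out, level = [], float(drive[0])
    for d in drive:
        level += alpha * (d - level)
        out.append(level)
    return out

drive = [512, 513] * 60                      # temporal dither between adjacent codes
fast = lc_response(drive, tau_frames=0.1)    # fast crystal: tracks the flicker
slow = lc_response(drive, tau_frames=5.0)    # slow crystal: settles near 512.5
print(max(fast[-20:]) - min(fast[-20:]))     # large swing: visible flicker
print(max(slow[-20:]) - min(slow[-20:]))     # small swing: steady in-between level
```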
It gets complicated when trying to determine whether two systems are color dithering, because if one system has already color dithered, the second system in the pipeline should have no effect at all on the signal. And if one system color dithers the original 16-bit value, it is not possible to further increase the bit depth of the result, no matter how many layers of dithering are applied. As I see it, the algorithm behind color dithering is all or nothing.
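To illustrate the all-or-nothing point with the same assumed probabilistic-rounding model: the second dither stage only ever sees whole 10-bit codes, has no fractional remainder left to encode, and so passes the signal through unchanged.

```python
import random

def dither(value: float, step: float) -> float:
    """Randomized rounding of `value` to multiples of `step`."""
    base = (value // step) * step
    frac = (value - base) / step
    return base + (step if random.random() < frac else 0.0)

v16_as_float = 512.5   # a 16-bit source expressed in 10-bit code units

once  = [dither(v16_as_float, 1.0) for _ in range(100_000)]
twice = [dither(dither(v16_as_float, 1.0), 1.0) for _ in range(100_000)]

print(sum(once) / len(once))     # ~512.5: one stage already simulates the 11-bit level
print(sum(twice) / len(twice))   # ~512.5 as well: the second stage is a no-op
```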
What's interesting to me is that you are seeing a change in bit depth with Stillcolor. I don't see that on my M1 MacBook Pro 16", or with my external 10-bit (maybe 8-bit+FRC) OLED display. The highest gradient ladder level with new divisions remains the 10-bit level for both displays, with or without Stillcolor.