aiaf I asked on the MacRumors forums, and one person said it's all 8-bit+FRC except on the Pro Display XDR.
It may not be a fair suggestion, but this makes it seem like Apple's marketing department is involved. Discovering the bit depth of a monitor for ourselves should not be this convoluted. For one, Apple makes no mention of FRC in its specifications. And if Apple is using 6-bit+FRC panels in a product, I can see the motivation to hide that spec. Or they may hide that property because they want the Pro Display XDR to appear to be the sole gateway to wide color. This may not be true, and I hope it isn't.
I spent some time trying simply to print out the supported sample bit depths of my displays. I found that every method, function, and property that once provided a display's bit depth has since been deprecated. Maybe someone has had more luck; if not, that missing public API is, unfortunately, somewhat telling.
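For reference, this is roughly the sort of probing I mean. It's a sketch, not a definitive survey: the Core Graphics call at the end has been deprecated for years, and NSScreen only reports the depth the window server chooses to expose, which is not necessarily the panel's native depth.

```swift
import AppKit
import CoreGraphics

// Sketch only: these values describe what the window server exposes,
// not necessarily what the panel natively supports.
for screen in NSScreen.screens {
    let name = screen.localizedName

    // NSScreen.depth is the frame buffer depth AppKit is using for this screen.
    print("\(name): NSBitsPerSampleFromDepth =", NSBitsPerSampleFromDepth(screen.depth))

    // The device description dictionary carries a bits-per-sample entry as well.
    if let bps = screen.deviceDescription[.bitsPerSample] as? Int {
        print("\(name): NSDeviceBitsPerSample =", bps)
    }
}

// The direct Core Graphics query is deprecated (since macOS 10.11) and
// describes the frame buffer format, not the panel.
print("CGDisplayBitsPerPixel:", CGDisplayBitsPerPixel(CGMainDisplayID()))
```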
aiaf Having an intermediate 10-bit dithered step is possibly a physical limitation imposed by bandwidth, computational, or power requirements. A 64-bit-per-pixel (or even 40-bit) 4K/6K/8K signal at 60 Hz is a ton of data without compression. There is also the matter of driving multiple displays.
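Back-of-the-envelope, assuming the Pro Display XDR's 6016×3384 grid at 60 Hz and ignoring blanking and DSC, the raw numbers look like this:

```swift
// Rough, uncompressed figures only; real links add blanking intervals and
// usually lean on DSC, so treat these as ballpark numbers.
let pixelsPerFrame = 6016 * 3384      // ≈ 20.4 million pixels (6K)
let framesPerSecond = 60

for bitsPerPixel in [24, 30, 40, 64] {   // 8-bit RGB, 10-bit RGB, 10-bit RGBA, 16-bit float RGBA
    let gbps = Double(pixelsPerFrame * framesPerSecond * bitsPerPixel) / 1e9
    print("\(bitsPerPixel) bpp: \(String(format: "%.1f", gbps)) Gbit/s")
}
```

Even the plain 10-bit RGB case comes out around 36 Gbit/s, which already outruns a single uncompressed DisplayPort 1.4 link (roughly 26 Gbit/s of payload).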
To perform alpha-blending, Core Animation can operate on, and retain, a 32-bit-per-component floating-point texture with alpha; in other words, Metal can handle a 128-bit floating-point buffer. Every CALayer is associated with one NSWindow, and every NSWindow is associated with one NSScreen. So, as I see it, it would be trivial to dither that buffer down to a display buffer at whatever bit depth the display or cabling natively supports. I still don't see the need to dither once to 10-bit and again to 8-bit; that would only require more bandwidth and computation. It could presumably all be handled once by Core Animation, with better results: the resulting buffers are smaller, and the dithering is higher quality because it comes straight from the original 32-bit buffer.
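To make the "dither once, straight from the float buffer" idea concrete, here is a toy sketch of my own (not Apple's pipeline) that quantizes a 0–1 float component directly to an arbitrary target bit depth with a random dither offset; a real implementation would use an ordered or error-diffusion pattern and run on the GPU.

```swift
import Foundation

// Toy sketch: quantize a 0...1 float component straight to the target bit depth,
// with a random dither offset standing in for a proper dither kernel. No
// intermediate 10-bit pass is needed.
func ditherComponent(_ value: Float, toBits bits: Int) -> Int {
    let maxCode = Float((1 << bits) - 1)          // 255 for 8-bit, 1023 for 10-bit
    let scaled = min(max(value, 0), 1) * maxCode
    let noise = Float.random(in: -0.5 ..< 0.5)
    return Int(min(max((scaled + noise).rounded(), 0), maxCode))
}

// A 32-bit float gray of 0.5004 flips between 127 and 128 at 8 bits, weighted
// toward 128, so the average over pixels (or frames) approximates the true value.
let samples = (0..<8).map { _ in ditherComponent(0.5004, toBits: 8) }
print(samples)
```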
I just wonder if we can come up with an experiment that distinguishes FRC from native bit depth. It would put the concern about missed layers of temporal dithering to rest. My ladder test at least tells us what the system supports, but I haven't yet come up with something that isolates FRC.