It turns out that the minimum 8-bit limitation for HDMI on my Intel UHD Xe (12th-gen CPU) is not a hardware issue; it's a driver limitation in both Linux and Windows. In Windows, I was able to force a switch to 6-bit mode over HDMI by writing a specific value to the video controller's pipeline state register. The following bit patterns select the bit depth (a single set bit in this field picks the mode; for 8-bit the field is all zeros):
- 00000100000000000000000001000000 (0x04000040) - 6-bit
- 00000100000000000000000000000000 (0x04000000) - 8-bit
- 00000100000000000000000000100000 (0x04000020) - 10-bit
- 00000100000000000000000010000000 (0x04000080) - 12-bit
For my hardware, the 6-bit value is 0x04000040, and it works over an HDMI connection.
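To make the field layout explicit, here is a small C++ sketch of how the 6-bit value is derived from the patterns above. The constant and function names are mine, and the field position (bits 7:5) is simply read off the binary values listed, not taken from Intel documentation:

```cpp
#include <cstdint>

// Bit-depth field inferred from the patterns above: bits 7:5 of the
// register select the pipe's output depth; everything else is left alone.
// These names are my own, not from any official header.
constexpr uint32_t BPC_FIELD_MASK = 0x7u << 5;  // bits 7:5
constexpr uint32_t BPC_8BIT       = 0x0u << 5;  // 000 -> 8-bit
constexpr uint32_t BPC_10BIT      = 0x1u << 5;  // 001 -> 10-bit
constexpr uint32_t BPC_6BIT       = 0x2u << 5;  // 010 -> 6-bit
constexpr uint32_t BPC_12BIT      = 0x4u << 5;  // 100 -> 12-bit

// Build the new register value: clear the depth field, then set 6-bit.
uint32_t SetPipeDepth6Bit(uint32_t current)
{
    return (current & ~BPC_FIELD_MASK) | BPC_6BIT;
}

// Example: starting from the 8-bit value on my pipe (0x04000000),
// this yields 0x04000040 -- the value that worked for 6-bit over HDMI.
```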
P.S. To enable 6 bits over HDMI, I assigned the value 0x4000040 to the variable 'New', which is then written to the register at this line: https://github.com/skawamoto0/ditherig/blob/master/ditherig/ditherig.cpp#L630C60-L630C63 (this is from the original ditherig app code). I will post the modified source code later in case anyone else wants to experiment; the sketch below shows the shape of the change.
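Until the full source is up, here is a rough sketch of the edit. Only the variable name 'New' and the value 0x4000040 are from my actual change; the surrounding lines are paraphrased placeholders, not the real ditherig code:

```cpp
// Hypothetical context -- not the actual ditherig source. Only the
// variable name 'New' and the value below are from my real edit.
DWORD New;
// ...ditherig computes the value it intends to write into 'New'...
New = 0x4000040;  // override: force the 6-bit pattern before the write
// ...the existing write to the pipeline register then picks this up...
```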
Here are photos showing the screen in both 6-bit and 8-bit modes over HDMI. Banding is visible in the 6-bit mode:
https://ibb.co/qsGgD4Q
https://ibb.co/0Mjp5Ry
P.P.S. Since it works for me, it will most likely work on other UHD generations as well, although I can't say exactly which ones, since I only have an Intel UHD Xe (12th-gen CPU) for testing. I'll ask friends to help me find other UHD models to test.