It's 8 bits all the way down
While rummaging through logs, I thought I'd search for DCP messages to help figure out what color bit depth we're using internally, and lo and behold, the entire DisplayPort link setup and training is on show. Here's the relevant part:

The command to produce the above output is
log show --last 3h --info --predicate "processImagePath CONTAINS 'kernel' AND senderImagePath ENDSWITH 'DCP'"
Change 3h to cover the period since you last turned on your Mac or last changed resolution/framerate.
We can see Apple using a proprietary 7.992Gbps lane rate (just a tad shy of HBR3 8.1Gbps). At 4 lanes, this gives us a total link bandwidth of 31.968Gbps, or 25.5744Gbps usable bandwidth (DP has around 20% overhead).
We can also see that DSC is not used, and that the total video bandwidth is 25011120000bps, which barely fits within the usable bandwidth limit to transmit a maximum 3536x2456 at 8bpc and 120Hz.
This means it's impossible to transmit 10bpc at 120Hz at the required resolution, because that would require 31.264Gbps of usable bandwidth, which is not possible even at HBR3 rates, let alone Apple's 7.992Gbps.
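To double-check the arithmetic, here's the same math in Python. This is raw active-pixel bandwidth only (real DP payload accounting also involves blanking and framing), so treat it as a sanity check rather than an exact reproduction of the logged figure:

```python
# Reproduce the bandwidth arithmetic from the log values above.
# DP's ~20% overhead comes from 8b/10b line coding.
lane_rate_gbps = 7.992                      # Apple's proprietary lane rate
lanes = 4

link_bw = lane_rate_gbps * lanes            # raw link bandwidth
usable_bw = link_bw * 0.8                   # after 8b/10b coding overhead

def video_bw_gbps(width, height, bpc, hz):
    """Uncompressed video bandwidth: pixels * 3 components * bits * refresh."""
    return width * height * 3 * bpc * hz / 1e9

bw_8bpc = video_bw_gbps(3536, 2456, 8, 120)
bw_10bpc = video_bw_gbps(3536, 2456, 10, 120)

print(f"link: {link_bw:.3f} Gbps, usable: {usable_bw:.4f} Gbps")
print(f"8bpc@120Hz:  {bw_8bpc:.3f} Gbps (fits: {bw_8bpc <= usable_bw})")
print(f"10bpc@120Hz: {bw_10bpc:.3f} Gbps (fits: {bw_10bpc > usable_bw and 'no' or 'yes'})")
```

The 8bpc figure comes out at ~25.011Gbps, within rounding distance of the logged 25011120000bps, while the 10bpc figure blows past the 25.5744Gbps usable limit.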
The validateVideo function also tells us we're using 8bpc.
So GPU/DCP dithering makes a ton of sense now: it exists to work around bandwidth limitations. Apple cannot send 10bpc at 120Hz to the TCON, so it engineered the internal display pipeline to use 8bpc at all times, and by sending temporally pre-dithered frames to the TCON, it can claim 1 billion colors and 120Hz (ProMotion).
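The temporal-dithering idea is easy to sketch. The toy function below is not Apple's algorithm (the real pipeline presumably dithers per-pixel with spatial offsets and more sophisticated patterns); it just shows how averaging 8-bit frames over time can approximate a 10-bit value:

```python
# Toy illustration of temporal dithering: a 10-bit value is represented as a
# sequence of 8-bit frames whose time average approximates it.
def temporal_dither_10_to_8(value_10bit, num_frames=4):
    base, frac = divmod(value_10bit, 4)      # 10-bit -> 8-bit base + 2-bit remainder
    # Emit `frac` frames at base+1 and the rest at base; the average over
    # num_frames is base + frac/4, i.e. value_10bit/4 in 8-bit units.
    return [min(base + (1 if i < frac else 0), 255) for i in range(num_frames)]

frames = temporal_dither_10_to_8(513)        # 513/4 = 128.25
print(frames)                                 # [129, 128, 128, 128]
print(sum(frames) / len(frames) * 4)          # 513.0
```

Viewed at 120Hz, the eye integrates those frames, which is how an 8-bit stream can pass for 10-bit color.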
The 8-bit video input definition we saw in the EDIDs also confirms these findings.
Negotiated color bit depth
The above logging trick also works for external displays, and it's useful for showing the negotiated color bit depth.
log show --last 3h --info --predicate "processImagePath CONTAINS 'kernel' AND senderImagePath CONTAINS 'DCPEXT'"
Again, replace 3h with the appropriate time period.
Here's the color bit depth I see on the external display when I connect a low-bandwidth HDMI cable:

It appears macOS selects a color and timing mode that fits within the available bandwidth. AllRez and BetterDisplay do not report this "final" mode.
Versus when I connect using a recent DP cable:

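That "fit the mode to the bandwidth" behaviour can be sketched like this. Note that best_bpc is an invented name and the calculation ignores blanking intervals, so this is only illustrative of the fallback logic, not what the DCP actually runs:

```python
# Hypothetical sketch: pick the deepest color mode whose (active-pixel)
# bandwidth fits the usable link bandwidth. Real mode selection also
# weighs timings, blanking, and DSC availability.
def best_bpc(width, height, hz, usable_gbps, candidates=(12, 10, 8, 6)):
    for bpc in candidates:
        if width * height * 3 * bpc * hz / 1e9 <= usable_gbps:
            return bpc
    return None  # no mode fits; a real system would drop resolution/refresh

# Full HBR2 4-lane link: 4 * 5.4 Gbps * 0.8 = 17.28 Gbps usable
print(best_bpc(3840, 2160, 60, 17.28))   # -> 10
# A degraded link, e.g. the low-bandwidth HDMI cable above
print(best_bpc(3840, 2160, 60, 13.0))    # -> 8
```

The same 4K60 signal negotiates 10bpc on a healthy link but silently falls back to 8bpc on the constrained one, matching what the logs show.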
What about TCON dithering?
There are a lot of questions now, such as whether the TCON is actively using its dithering system. The LUT config payloads from the ipsw packages are proving difficult to reverse engineer. Here's one of them:

This kind of looks like nothing to me? Sequential 4-byte segments that are almost identical. Needs expert eyes to decipher for sure.
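If anyone wants to poke at a payload themselves, here's a minimal way to chunk it into 32-bit words and XOR neighbours to surface those near-identical segments. The sample bytes below are synthetic, not from an actual ipsw:

```python
import struct

# Chunk a payload into 32-bit little-endian words and diff each word
# against its predecessor; near-zero XORs mean "almost identical" segments.
def diff_words(payload: bytes):
    words = [w for (w,) in struct.iter_unpack("<I", payload)]
    return [(i, words[i], words[i] ^ words[i - 1]) for i in range(1, len(words))]

# Synthetic stand-in for a real LUT config blob.
sample = struct.pack("<4I", 0x01000100, 0x01010101, 0x01010102, 0x01020102)
for index, word, xor in diff_words(sample):
    print(f"word {index}: {word:#010x}  xor-with-prev {xor:#010x}")
```

Low-entropy XOR output would at least confirm the repeating-structure impression, even if it doesn't tell us what the LUT encodes.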
Additionally, what about @Blooey's gradient image? Why do we see fine gradations above the 8-bit section? Spatial dithering? Gamma trickery? Do we need to revise our assumptions about native 8-bit displays? Additional dithering from the DCP/GPU?
Whatever the case is, I think there's very little doubt that the panel receives an 8-bit signal.