Some notes from my research
From my understanding of GPU drivers, the "output stage" (which includes dithering) handles each display output (internal LCD, DVI, HDMI, etc.) separately.
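Since dithering is handled per output, some open drivers (nouveau with "dithering mode"/"dithering depth", radeon with "dither") expose it as an RandR output property; amdgpu and i915 generally don't, so an empty result is inconclusive. A minimal sketch, assuming an X session with xrandr installed, that just lists any dither-related properties per output:

```python
#!/usr/bin/env python3
"""Sketch: list per-output dithering properties reported by xrandr --prop."""
import re
import subprocess

def dither_properties():
    out = subprocess.run(["xrandr", "--prop"], capture_output=True, text=True).stdout
    current = None
    found = {}
    for line in out.splitlines():
        # Output names start at column 0, e.g. "DP-1 connected ..."
        m = re.match(r"^(\S+) (connected|disconnected)", line)
        if m:
            current = m.group(1)
            continue
        # Property lines are indented; keep anything mentioning dithering
        if current and "dither" in line.lower():
            found.setdefault(current, []).append(line.strip())
    return found

if __name__ == "__main__":
    for output, props in dither_properties().items():
        print(output)
        for prop in props:
            print("   ", prop)
```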
For each graphics chipset vendor (AMD, Intel, NVIDIA), a different method needs to be researched and developed. It might also depend on the specific GPU model in question.
Many Linux graphics drivers are open source, so while the code wouldn't be useful directly, it kind of serves as a "blueprint" of what needs to be done and how to do it.
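As a starting point for that blueprint, here's a minimal sketch (assuming a local kernel source checkout at a path you'd adjust) that lists where dithering-related code shows up in the open drivers; which of those files actually program the dither hardware varies by driver generation:

```python
#!/usr/bin/env python3
"""Sketch: find dithering-related code in a Linux kernel source checkout."""
from pathlib import Path

KERNEL_SRC = Path("~/src/linux").expanduser()  # adjust to your checkout (assumption)
DRIVER_DIRS = [
    "drivers/gpu/drm/amd/display",   # AMD display core (DC)
    "drivers/gpu/drm/i915",          # Intel
    "drivers/gpu/drm/nouveau",       # NVIDIA (open-source driver)
]

def find_dither_refs():
    for rel in DRIVER_DIRS:
        for path in (KERNEL_SRC / rel).rglob("*.[ch]"):
            text = path.read_text(errors="ignore")
            for lineno, line in enumerate(text.splitlines(), 1):
                if "dither" in line.lower():
                    yield path.relative_to(KERNEL_SRC), lineno, line.strip()

if __name__ == "__main__":
    for path, lineno, line in find_dither_refs():
        print(f"{path}:{lineno}: {line}")
```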
For what it's worth: I realize some people complain about the display output on Linux, but right here I have a workstation with an AMD Radeon Pro graphics card under Linux, and I can confirm that by default, no temporal dithering is used whatsoever, either in the BIOS or the running OS.
Seagull: Step 1 is to get someone with a capture card who can confirm that these devices are dithering, and who can test the finished result to ensure it works.
For an external display output, that's easy (points under desk). For a laptop display, given how the output stages are different, you're either looking at disassembling the laptop screen (not something you'd want to do) or possibly using a high-speed camera (expensive).
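For the capture-card route, here's a rough sketch of what the confirmation could look like. It assumes the capture card shows up as a normal V4L2 device that OpenCV can open (device index 0 is an assumption) and that the source is displaying a completely static image; with no dithering the consecutive frames should be nearly identical, while temporal dithering shows up as a large, steady fraction of 1-LSB pixel flips every frame:

```python
#!/usr/bin/env python3
"""Sketch: compare consecutive captured frames of a static image for dithering."""
import cv2
import numpy as np

def dither_check(device=0, frames=120):
    cap = cv2.VideoCapture(device)
    ok, prev = cap.read()
    if not ok:
        raise RuntimeError("could not read from capture device")
    changed_ratios = []
    for _ in range(frames):
        ok, cur = cap.read()
        if not ok:
            break
        diff = cv2.absdiff(cur, prev)
        # Fraction of pixels that changed at all between consecutive frames
        changed_ratios.append(np.count_nonzero(diff) / diff.size)
        prev = cur
    cap.release()
    return changed_ratios

if __name__ == "__main__":
    ratios = dither_check()
    # Near zero (minus capture noise) suggests no temporal dithering;
    # a consistently large fraction of changed pixels suggests it's active.
    print(f"mean fraction of changed pixels per frame: {np.mean(ratios):.4f}")
```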
- The elephant in the room: Appleisms.
Over time, macOS is becoming more of a "black box", and while the M1 hardware is interesting in some ways, it also presents some new challenges here, since it uses a (somewhat custom) GPU based on what the iPhone uses.
I do worry that with Apple's transition to ARM, at some point in the future x86 Macs are going to be left out in the cold with a lack of updates, making them the new "dinosaurs" we don't want to be stuck using.
Best bet here is that if the Linux porting efforts are "usable", we could potentially figure out what it's doing differently. But that's still some time (1+ years) away.