This patent, initially filed by Apple in 2016, explains how they use various techniques for SDR -> HDR. Interestingly, this paragraph seems to indicate the use of dither to emulate HDR on SDR screens, especially in low-light and low-power conditions:
Embodiments may also be applied to maintain SDR content in poor ambient conditions. Using embodiments, standard or SDR displays may produce HDR results when the HDR content is compressed and an appropriate dither applied, especially when viewed in a dim environment such that the user's perception is dark adapted. In addition, by dynamically adapting a display panel to different environments and ambient conditions, embodiments may use less backlight in some viewing environments, which for example may save power on mobile devices.
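If I understand that right, "compressed" means squeezing HDR's brightness range into what an SDR panel can actually show, and the dither then hides the banding the compression would otherwise create. Here's a toy sketch of the compression half in Python (entirely my own guess at the idea, not Apple's code; the Reinhard-style curve and the nit numbers are just stand-ins):

```python
import numpy as np

def compress_hdr_to_sdr(luminance: np.ndarray, peak_nits: float = 1000.0,
                        sdr_nits: float = 100.0) -> np.ndarray:
    """Squeeze HDR luminance (in nits) into an SDR panel's range using a
    simple Reinhard-style roll-off: shadows stay nearly linear while
    highlights get compressed. Only the shape of the curve matters here."""
    x = luminance / peak_nits    # 0..1 relative to the HDR peak
    y = x / (1.0 + x)            # Reinhard soft-clip of highlights
    y *= 2.0                     # peak input (x = 1) gives y = 0.5; scale to 1.0
    return y * sdr_nits          # rescale into the SDR panel's range

frame_nits = np.array([[0.5, 100.0, 500.0, 1000.0]])
print(compress_hdr_to_sdr(frame_nits))  # -> [[ ~0.1  ~18.2  ~66.7  100. ]]
```

Note how a 1000-nit highlight lands at the 100-nit SDR ceiling while midtones keep some separation instead of being hard-clipped. The dither half of the trick shows up again in the last quote below.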
Later on in the patent, Apple describes the video pipeline:
The video pipe 214 may perform various processing tasks or content processing techniques on the content including but not limited to noise/artifact reduction, scaling, sharpening, and color processing. In some embodiments, a frame rate conversion 216 component may convert the output of the video pipe 214 to a higher frame rate by generating intermediate frame(s) between existing frames. Converting to a higher frame rate may, for example, help to compensate for judder that may appear in HDR video, for example due to flicker being more evident at higher brightness levels. Output of the frame rate conversion 216 component may be fed into a display pipe 220 that may perform various processing tasks including but not limited to scaling, color space conversion(s), color gamut adjustment, and local or global tone mapping. A display backend 230 may then perform additional processing tasks including but not limited to color (chroma) and tone (luma) adjustments, tone mapping, backlight or brightness adjustments, gamma correction, white point correction, black point correction, and spatio-temporal dithering to generate display content 232 output to a target display panel 240. In some embodiments, the display pipeline 210 may include a compositing 218 component that composites or blends other SDR or HDR digital information such as text or UI elements with the rendered HDR content. In some embodiments, the display pipeline 210 may convert the HDR content into a linear color space (e.g., a linear RGB or YCC color space) for compositing.
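To make that wall of text a little more concrete, here's how I picture the flow: a chain of stages, each transforming frames before handing them on. The stage names and numbers come from the patent, but the bodies are empty placeholders (this is my own toy model, not Apple's code), and the exact position of compositing 218 is my guess:

```python
# Toy model of the stages named in the patent. Each stage is a function
# from frames to frames; the bodies are placeholders, not real processing.

def video_pipe(frames):
    # 214: noise/artifact reduction, scaling, sharpening, color processing
    return frames

def frame_rate_conversion(frames):
    # 216: generate intermediate frames to raise the frame rate (anti-judder)
    return frames

def compositing(frames, ui_elements):
    # 218: blend SDR/HDR text and UI elements over the rendered HDR content
    return frames

def display_pipe(frames):
    # 220: scaling, color space conversion, gamut adjustment, tone mapping
    return frames

def display_backend(frames):
    # 230: chroma/luma adjustments, gamma, white/black point correction,
    # backlight adjustment, and spatio-temporal dithering
    return frames

def display_pipeline(content, ui_elements):
    frames = video_pipe(content)
    frames = frame_rate_conversion(frames)
    frames = compositing(frames, ui_elements)  # placement here is my guess
    frames = display_pipe(frames)
    return display_backend(frames)  # -> display content 232 for panel 240
```

The point being: by the time a frame hits the panel it has passed through at least five separate processing stages, several of which can introduce or modify flicker and dither.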
And it just goes on and on:
In some embodiments of an HDR rendering and display system, the rendering and/or display pipelines may employ dithering techniques, for example a stochastic pre-quantization dither, when or after applying any of the various pixel scaling, compression, or mapping techniques as described herein. Applying dither after mapping or scaling content may help maintain the appearance of the original, non-scaled content, and may reduce or eliminate banding or other artifacts that may result from the mapping techniques.
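That last part is easy to demo. Here's my own toy Python take on a stochastic pre-quantization dither (again, nothing to do with Apple's actual implementation): add up to half a quantization step of random noise to each pixel before rounding down to fewer bits, and smooth gradients come out as fine grain instead of visible bands:

```python
import numpy as np

def stochastic_pre_quantization_dither(image: np.ndarray, out_bits: int = 8) -> np.ndarray:
    """Quantize a float image (values in 0..1) down to `out_bits` per channel,
    adding +/- half a quantization step of uniform random noise first. The
    noise decorrelates the rounding error, so smooth gradients come out as
    fine grain instead of visible bands."""
    levels = (1 << out_bits) - 1  # e.g. 255 for 8-bit output
    noisy = image * levels + np.random.uniform(-0.5, 0.5, size=image.shape)
    return np.clip(np.round(noisy), 0, levels).astype(np.uint8)

# A shallow gradient: plain rounding produces visible steps across the width,
# while the dithered version trades them for noise the eye averages out.
gradient = np.tile(np.linspace(0.0, 0.05, 1920), (64, 1))
banded   = np.round(gradient * 255).astype(np.uint8)  # undithered reference
dithered = stochastic_pre_quantization_dither(gradient)
```

That averaging only works if your visual system tolerates the noise, which is presumably where some of us run into trouble.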
Most of this is over my head. I'm sharing mainly because I think it's relevant to M5 and MacOS 26. There's been debate on these forums over whether Apple is unaware of those of us who have issues with these devices. It's very clear the engineers are using a ton of techniques to push HDR on SDR screens and to smooth out gradients and video, with dithering chief among them. Some of it seems to be an attempt to mask or cancel out flicker…but I don't think it's working.
This would coincide with the lead-up to the release of the Touch Bar MacBook Pros and a real shift toward the new generation of Macs and MacOS. I believe this forum was founded around 2018. The timeline seems to add up.
So then the question becomes: what is it that has intensified with each chipset since M1 and each OS since Big Sur, affecting more and more people?
The display pipeline is very complex. They've got a lot going on, and I doubt dithering is the only problem, though it's probably a huge factor along with other flicker. In my mind, this could hint at why the panel lottery seems to play such a big role: there are so many opportunities for unwanted flicker, artifacts, and harmful display behavior to go uncontrolled. If you get a bad screen with transistor leakage, or something that wasn't aligned properly, it's conceivable that could cause a cascade of problems downstream in such a complex display pipeline.
I dunno. I'm just spitballing here. Someone like @JTL or @aiaf has done way more research into this than I have.