madmozg counterpoint: I have a 2015 Retina MacBook Pro 15", so well before the 2019 models, and it also has Intel Iris Pro and AMD graphics. The dithering on it is really bad: shimmer everywhere, and I can clearly see solid-color backgrounds moving, even on a machine all the way back from 2015.

I have even found a new AMD dither disable method that works on the 2015 that does improve external monitor output. However, even though it also affects the internal display in some way, there is still moving shimmer.

I'm pretty sure this is coming from something macOS itself is additionally doing. Even though I've figured out how to get the AMD graphics card to disable its own dithering (which seems to specifically concern color profiles / gamma tables, because both of those cause much stronger banding after applying the method), the IOKit registry still says "automatic" dither mode and "dith=true", as I mentioned in a previous post in this thread. I assume those two entries relate to extra processing macOS itself does for the internal display.

FWIW, my 2015 Pro is on Monterey 12.6.8.

So there were definitely MacBooks before 2019 that had dithering, especially the ones with AMD graphics (on mine I can still see shimmering even in integrated Intel mode, though it's a bit less than in AMD mode).

DisplaysShouldNotBeTVs macOS provides no way to truly force a lower color depth on Apple Silicon, that's why it would be a big deal if Stillcolor is able to gain this feature.

Have you looked in the EDID direction?

Perhaps changing the EDID values could tell the graphics card to enforce 8-bit mode.
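For anyone who wants to try the EDID angle: in an EDID 1.4 base block, byte 20 (the video input descriptor) encodes bits per primary color for digital inputs in bits 6:4, and byte 127 is a checksum that must bring the whole 128-byte block sum to 0 mod 256. A minimal sketch of patching a block to advertise 8 bpc (whether macOS would honor such an override at all is a separate, open question):

```python
# EDID 1.4: byte 20 is the video input descriptor; for digital inputs,
# bits 6:4 encode bits per primary color. Byte 127 is the block checksum.
DEPTH_CODE = {6: 0b001, 8: 0b010, 10: 0b011, 12: 0b100, 14: 0b101, 16: 0b110}

def set_edid_bit_depth(edid: bytes, bits_per_color: int) -> bytes:
    """Return a copy of a 128-byte EDID base block advertising a new depth."""
    e = bytearray(edid)
    if len(e) < 128 or not (e[20] & 0x80):
        raise ValueError("need a 128-byte EDID base block with a digital input descriptor")
    e[20] = (e[20] & 0b1000_1111) | (DEPTH_CODE[bits_per_color] << 4)
    e[127] = (-sum(e[:127])) % 256  # all 128 bytes must sum to 0 mod 256
    return bytes(e)
```

Even if the panel then receives an 8-bit signal, nothing here guarantees the GPU stops dithering internally, so treat this as an experiment, not a fix.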

    Got stuff to run. Experimenting with different options. It seems like most settings don't noticeably affect anything, but at least they can be changed. BacklightMatching is interesting, but I didn't manage to change it since it's a dict, and this is all a bit new to me. ChatGPT / Claude can give a fairly good guesstimate of what the different things are.

    "BacklightMatching" = {"IOPropertyMatch"={"backlight-control"=Yes}}

    Some output from ChatGPT

    Let's illuminate the path by highlighting the cryptic runes that might hold the power to alter the very fabric of visual presentation, based on their assumed roles in this arcane machinery:

    Dithering and Flicker Management:

    • enableDither: No (A clear toggle for dithering, turning on or off could influence visual quality and potentially affect flickering.)

    • uniformity2D: No (May relate to screen uniformity adjustments, possibly involving dithering processes.)

    • enable2DTemperatureCorrection: No (Temperature-related corrections could impact display properties, possibly affecting flicker.)

    • IOMFBDigitalDimmingLevel: Various values (Controls the dimming level; digital dimming techniques can influence flicker.)

    • enablePixelCapture: No (While not directly related, capturing pixel data might involve processes that can influence screen rendering and flicker.)

    • IOMFBBrightnessCompensationEnable: Yes/No (Adjusts brightness dynamically, potentially influencing flickering behaviors.)

    • IOMFBBrightnessLevel: Various values (Direct control over brightness can impact the visibility and perception of flicker.)

    • IOMFBBrightnessLevelMA: Various values (Likely related to manual adjustment of brightness levels, affecting flicker perception.)

    • IOMFBBrightnessLevelIDAC: Various values (IDAC adjustments relate to current control for brightness, impacting flicker.)

    • IOMFBTemperatureCompensationEnable: Yes/No (Temperature compensation can alter display characteristics, potentially affecting flicker.)

    Advanced Control and Optimization:

    • DisableDisplayOptimize: 0 (Disabling optimizations might reveal underlying flicker or dithering artifacts by removing compensations.)

    • enableDarkEnhancer: No (Enhancing dark regions may involve adjusting dithering or dimming strategies to reduce flicker.)

    • PCCEnable: Yes (Pixel Compensation Curves can affect how visuals are rendered, possibly influencing dithering and flicker.)

    • enableLinearToPanel: Yes (Adjustments between linear light and panel display can involve processes that affect flickering.)

    • enableDefaultTemperatureCorrection: No (Indicates if default temperature correction is used, which can affect visual output and potentially flicker.)

    Display Performance and Quality:

    • DispPerfState: 0 (Performance states can affect how content is rendered and displayed, including flicker management strategies.)

    • IOMFBSupportsICC: Yes (ICC profiles control color management, which can indirectly affect visual rendering processes related to dithering and flickering.)

    • IOMFBSupports2DBL: Yes/No (2D backlighting support might influence how backlight adjustments are made, affecting flicker.)

    Debug and Testing:

    • DebugUInt32: 0 (Generic debug parameter; its role in flickering/dithering is uncertain without further context but could be useful for testing.)

    • IOMFB Debug Info: {} (Contains debug information; could reveal insights into flicker and dither control mechanisms during troubleshooting.)

      I actually had some situations where, after tinkering, the backlight was off after reopening the lid. So it seems like some things might require more than just changing the value.
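For reference, the properties in the list above can be dumped without extra tooling: `ioreg` can emit the registry as an XML plist (`-a` for XML output, `-r` to restrict to a subtree, `-w0` to disable line clipping), which Python's plistlib parses directly. A sketch against a mocked-up fragment, since the actual node and class names vary by machine:

```python
import plistlib

# Mocked-up fragment in the shape of `ioreg -a` XML output; real dumps
# come from e.g. `ioreg -arw0 -n <node name>` and are much larger.
SAMPLE_IOREG_XML = b"""<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0"><array><dict>
  <key>enableDither</key><false/>
  <key>uniformity2D</key><false/>
  <key>IOMFBBrightnessLevel</key><integer>32768</integer>
</dict></array></plist>"""

def display_props(ioreg_xml: bytes, keys):
    """Collect the requested property keys from every node in the dump."""
    found = {}
    for node in plistlib.loads(ioreg_xml):
        for k in keys:
            if k in node:
                found[k] = node[k]
    return found
```

On a real machine you'd feed it the stdout of an actual `ioreg` invocation; grep `ioreg -lw0` first to find which node carries the IOMFB keys on your model.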

      async Awesome! Keep experimenting, are you measuring things subjectively or are you doing microscopy? Personally I'd take anything ChatGPT spouts about these props with a grain of salt. I think brightness compensation is something we should spend more time on.

        DisplaysShouldNotBeTVs macOS provides no way to truly force a lower color depth on Apple Silicon, that's why it would be a big deal if Stillcolor is able to gain this feature.

        Suppose temporal dithering is off, but the graphics card continues to put 10-bit color values in the framebuffer.

        The question is: who implements the down-scaling of quantization from 10 bit to 8 bit? The graphics card, or the display's controller?

        In any case, the down-scaling (10 bit -> 8 bit) happens, and regardless of which hardware does it, true 8-bit color is displayed. Taking this into account, a second question arises: how does this physically affect the human eyes/brain?

        If I understood correctly, you claim that 10-bit color output from the graphics card, passed to an 8-bit monitor, somehow affects the human eye?
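Whichever chip does it, the down-scaling step itself is just requantization. A minimal round-to-nearest sketch (an assumption for illustration; real hardware may truncate or re-dither instead):

```python
def requantize_10_to_8(v10: int) -> int:
    """Map a 10-bit code (0..1023) onto the nearest 8-bit code (0..255)."""
    return (v10 * 255 + 511) // 1023  # round to nearest, no dithering

# Every 8-bit output level now stands in for ~4 adjacent 10-bit inputs,
# so a smooth 10-bit gradient collapses into 256 flat steps: static
# banding, but no temporal modulation for the eye to pick up.
```

That's the crux of the question above: a static requantization like this produces banding but no flicker, whereas temporal dithering trades banding for flicker.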

        • aiaf replied to this.

          aiaf Personally I'd take anything ChatGPT spouts about these props with a grain of salt.

          I believe only the "*.color.*" and "*.dither.*" keywords affect "screen quality" for people sensitive to temporal dithering.

          It would be more productive to disassemble Apple's silicon framebuffer driver.

          • aiaf replied to this.

            NewDwarf @DisplaysShouldNotBeTVs suspects there's a 2nd layer of dithering that happens before the buffer gets displayed on the built-in panel. I personally don't lean towards that theory (but seriously, all bets are off with Apple; they love to dither). I'm starting to suspect that my MBP 16" Max display is true 10-bit.

              NewDwarf VUCEnable when turned off (it's enabled by default on built-in displays) has a distinct effect on banding and grays in particular. It gives grays a reddish tint. Under a Carson pocket microscope at 60Hz refresh rate, I noticed a very obvious flicker on the red LEDs every 2 frames for 2 frames when VUCEnable was off (also dithering was off). I don't know what that means. Pixel inversion? Why reds? And why 2 frames of high luminosity followed by 2 frames of lower luminosity (half the refresh rate)? It's difficult to repeat this observation because of how fragile and inexact this microscope+phone setup is.

                aiaf idk if macOS has these like iOS (assuming so), but are you using True Tone / Night Shift / Dark Mode / Reduce White Point? Those can all cause pixel flicker.

                • aiaf replied to this.

                  aiaf check with the microscope having dark mode off and on. The admin @ flickersense.org was noticing pixel flicker with dark mode enabled on iOS

                  • aiaf replied to this.

                    jordan there was definitely dithering on Dark Mode UI elements when enableDither was on. I captured those with the HDMI capture card (see this diff frame; notice the menu bar and the dithering pattern in effect). But I don't trust a pocket microscope + phone setup to reliably capture such minute changes in intensity.

                      aiaf

                      I guess SDI is the only possible way right now to get truly lossless capture (not the Elgato cards).

                      Never mind, I see. Currently I'm using a cheap, highly lossy HDMI capture device, not to measure anything but as a way to reduce my symptoms when using a dithering Windows 11 / NVIDIA setup: it simply doesn't faithfully reproduce the dithering patterns when viewed through OBS running on a safer, less-dithering secondary device.

                      aiaf suspects there's a 2nd layer of dithering that happens before the buffer gets displayed on the built-in panel.

                      This is not clear to me… Dithering, by definition, only has an effect (on the human eye) when light is emitted by the pixels/sub-pixels.

                      You already proved objectively (with the Blackmagic recorder) that dithering is totally gone when it's disabled.

                      If some kind of dithering had been present, we would have noticed it on the diff video. It doesn't matter how often a sub-pixel flickers; the difference will show up on a static picture sooner or later.
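That diff-video logic is easy to state as code: feed in captured frames of a static test image, and any nonzero inter-frame difference implicates temporal processing of some kind. A toy model (hypothetical grayscale frames as nested lists, just to make the argument concrete):

```python
def diff_frame(frame_a, frame_b):
    """Per-pixel absolute difference of two same-size grayscale frames."""
    return [[abs(a - b) for a, b in zip(ra, rb)]
            for ra, rb in zip(frame_a, frame_b)]

def has_temporal_activity(frames, threshold=0):
    """True if any consecutive pair of frames differs anywhere by more
    than `threshold`. On a static test image, any hit implicates temporal
    processing (dithering, pixel inversion, compensation, ...)."""
    return any(d > threshold
               for f0, f1 in zip(frames, frames[1:])
               for row in diff_frame(f0, f1)
               for d in row)
```

The one caveat, raised later in the thread, is that this only sees what reaches the capture device: processing done past the video-out tap (i.e. inside the internal panel pipeline) never shows up in the diff.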

                      …BTW, I also ordered the Blackmagic recorder toy. The problem is I don't have any Apple Silicon laptop.

                        NewDwarf the Asahi Linux project has made giant strides in reverse engineering the DCP interface, and have written clients/drivers for it https://github.com/AsahiLinux/linux/tree/asahi/drivers/gpu/drm/apple

                        marcan has hours-long streams where he does this: https://www.youtube.com/@marcan42/search?query=DCP It's awe-inspiring, really.

                        I already have a theory about forcing a bit depth but I will share it in due course.

                          NewDwarf a video capture card can only ever record the processing that macOS applies to external display output; it can't record whatever changes are occurring on the internal display.

                          for example, uniformity2D is a parameter that controls a "fade to black" effect at the edges of specifically the internal XDR display. when toggling this, you can clearly and obviously see the brightness of the screen edges changing on the MacBook screen itself.

                          however, this parameter will not change anything for external display output including any HDMI/DisplayPort capture card recording. (even though capture cards are a good way to measure the quality of external monitor output, and still provide way more information about how external video signals are being messed with than e.g. a basic QuickTime screen recording which will tell you nothing)

                          for example, trying to tell external output to turn uniformity2D on won't make a black fade ever show up on an external monitor; the property only ever controls the internal miniLED panel, which can't be captured through video out.

                          aiaf really good to know. Does this mean it's recommended to leave VUCEnable at the system default, because trying to disable it would actually introduce more flicker?

                          • aiaf replied to this.

                            NewDwarf so the way I understand DisplaysShouldNotBeTVs is that there's either another IC exclusive to the built-in panels which does dithering on the data received from the DCP. Or there's another piece of code in the GPU/DCP pipeline that does dithering for the built-in panel regardless of what enableDither is set to. Or some kind of color/brightness/voltage/temperature compensation/correction exclusive to the built-in panel has some dithering as a side effect. Time will tell.
