aiaf I made a quick (dis)proof of concept of a dither-busting overlay

https://gist.github.com/aiaf/8006286d0b38fe02518bf7f8a8890918

It plays a transparent checkerboard pattern every other frame over most of the screen at 60fps. I checked with my capture card; it does not alter the pixels frame-to-frame in a way significant enough for the DCP dithering algorithm to change its behavior. In fact, in its current incarnation this overlay becomes another source of eyestrain and screen instability. But maybe someone can experiment with different patterns, frequencies, or blending modes and see if that makes a difference.

Easier to just disable DCP dithering entirely with Stillcolor.
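
If you want to spot-check what the registry reports after a toggle, a quick grep is enough. This is only a rough sketch, assuming the enableDither key quoted later in this thread actually shows up in your machine's I/O Registry:

# Look for dither-related properties anywhere in the I/O Registry.
# "enableDither" is the key name referenced elsewhere in this thread; where
# (or whether) it appears can vary by machine and macOS version.
ioreg -l -w0 | grep -i dither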

    async I'll give this a try with a capture card

    DisplaysShouldNotBeTVs Any idea if similar APIs will be accessible on iOS when pushing apps directly from Xcode, or would it require jailbreaking? Unfortunately I'm on iOS 17 now, so that most likely won't be cracked for some time. I did a lot of jailbreaking previously, but lost track of it at some point.

    @aiaf thank you so much for working on this, super excited there's a chance I can have a usable M2.

    As a note: I am using an M2 MBA 13" with a cheap eBay replacement screen I put in, the kind that auto-disables True Tone. Running Stillcolor I still notice the same symptoms. I also see no banding, and no change whatsoever when toggling Stillcolor on/off. I tried running an RGB profile instead of the usual sRGB but still no luck. Today I tried to see the banding on an M3 (I believe) 15" MBA in the Apple Store and I could not notice it either.

    Do you think you will be releasing a beta to test the added features you were testing on your personal machine? Would love to try.

    Also, to anyone who wants info on this cheap replacement screen for research purposes, send me the commands to enter in Terminal and I will pull up what I can. It seems to be identical to the broken OEM one I had.

    Thanks again.

      aiaf

      The Apple ProMotion mechanism automatically switches between 48/60/120 Hz according to the content you are running.

      Therefore I think the advantage of running something on the screen has to be looked at from a different perspective.

      If you look at monitor reviews on https://www.rtings.com/ you can see that monitors behave differently at different refresh rates. For example, there are monitors that flicker at a certain refresh rate (60 Hz) while the same monitor doesn't flicker at the panel's maximum refresh rate (120/144 Hz).

      The ProMotion mechanism drops the refresh rate to 48 Hz if nothing is moving on the screen, but when you move your mouse it immediately goes up to 120 Hz. Apple's argument is battery optimization. There is no setting to force a fixed 120 Hz.

      The point is that if you have a video running on the screen (even an invisible one), the ProMotion mechanism switches into a different state. At least the author of this program found a notable difference: https://github.com/abinabdc/flickeringMacFix

      I cannot test it myself as I have the MBA without ProMotion, but I think it's definitely something worth figuring out. For example, there may be different PWM rates at different refresh rates.

        Hunter20 It goes way below that. See screenshot. It doesn't matter whether it is on ProMotion or set to 60 Hz; both are variable. Low/High Power mode also doesn't seem to matter. It can be checked with Quartz Debug (Additional Tools for Xcode). Variable refresh rate is important for smooth scrolling, but tbh I don't really see much difference. In Quartz Debug you can also force-disable the variable refresh rate, but that creates screen tearing when scrolling.

        There are flags for the number of windows open in the framebuffer, and of course the refresh rate changes if anything on screen is changing, like a video in another window. So there are tons of things that might change how it behaves.

        Is there a way to programmatically set the frame rate to a fixed number on macOS? IIRC the "limit frame rate" setting only caps it at 60 Hz. It still drops lower if it "sees fit". So it's still a variable refresh rate, albeit less so.

        Findings on external monitors & bit depth

        This might be common knowledge on this forum, but I’ll share it anyway.

        I have a Samsung Odyssey G7 which is reported as 10 bits (8-bit+FRC).

        I’m using an old HDMI cable I have lying around whose provenance I’m uncertain of, but the jacket reads “High Speed HDMI Cable with Ethernet.” It’s capable of a max 10.2 Gb/s transmission rate (1), (2). Let’s call it HSwE.

        I can now confirm the following behavior on an M3 Max MBP (equipped with an HDMI 2.1 port capable of 48.0 Gb/s).

        I set the monitor to 1920x1080 (HiDPI) at 60 Hz.

        • When connected via HDMI-2.1 -> (HSwE cable) -> HDMI-2.0:
          • (enableDither = No): Lagom gradient shows 128 distinct bands, screen is much easier to look at. My custom-made gray test shows a very obvious and abrupt change in tone when you toggle dithering.
          • (enableDither = Yes): Lagom gradient is a lot smoother and shows 256 bands, corresponding with the change in grayscale value at 3px intervals. My gray test shows a colder gray. The gradient here looks as smooth as the gradient on my built-in display. Same gradations.
        • When connected via Thunderbolt/USB4 -> DP 1.4 (TB/USB4 maxes out at 40 Gb/s, DP 1.4 at 32.4 Gb/s):
          • (enableDither = No): Lagom gradient looks just as smooth as my built-in display, showing 256 bands. There’s a shimmer around the top white border of the gradient, especially at higher brightness. My gray test shows a change in tone when toggling dithering, but not as obvious as when connected via HSwE-HDMI.
          • (enableDither = Yes): gradient still has 256 bands. There’s a change in the quality and smoothness of the gradient that’s difficult to put into words, a difference in luminosity. My gray test shows a slightly darker shade of gray.
        Conclusions:

        Assuming RGB pixel encoding and no DSC (Display Stream Compression).

        • Using an HSwE HDMI cable or lower forces 8bpc. The bandwidth required to send 10bpc (a 32-bit signal) at 60 Hz for 3840x2160 pixels (1920x1080 HiDPI) is around 14-16 Gbps plus HDMI overhead (see the rough calculation after this list). This requires at least a Premium High Speed HDMI® cable (aka Category 3 or 4K).
        • Increase the refresh rate and resolution to increase the bandwidth requirements (bandwidth calculator).
        • Using the USB4->DP cable allows the Mac to output a 32-bit signal, which makes the external display apply its own FRC/dithering algorithms, as evidenced by the above observations re. gradient banding.
        • M-series Macs apply DCP temporal dithering to the pixel buffer regardless of bit depth.
        • These Macs are capable of producing an 8-bit signal if forced to, at least based on bandwidth/speed negotiations. The machinery is there; we just have to find the right buttons.
        • So if you have a true 8-bit panel or a 10-bit (8-bit+FRC) one, use an HSwE HDMI cable or lower + Stillcolor to eliminate temporal dithering entirely. Avoid 6-bit+FRC panels.
        • Based on the above observations re. banding and gray change, I conclude that the built-in display receives a 10bpc signal by default. The unknown right now is whether the built-in panel (at least on an M3 Max MBP) is true 10-bit or has additional temporal dither applied by the TCON (timing controller), in addition to the DCP dithering which Stillcolor successfully disables.
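
        To sanity-check the 14-16 Gbps figure above, here is a rough back-of-the-envelope calculation (active pixels only; blanking intervals and HDMI encoding overhead are ignored, so the real on-wire rate is higher):

        # Active-pixel data rate for 3840x2160 @ 60 Hz, 10bpc RGB (30 bits per pixel).
        # Blanking and TMDS/FRL encoding overhead are ignored, so treat this as a floor.
        W=3840; H=2160; HZ=60; BPP=30
        echo "scale=1; $W * $H * $HZ * $BPP / 10^9" | bc
        # => 14.9 (Gb/s), well above the ~10.2 Gb/s ceiling of a High Speed HDMI cable,
        #    consistent with the link being forced down to a lower bit depth.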

          aiaf Thanks for those tests. I also mentioned this in my comment earlier. With 4K monitors it's easy to hit those limits with basic cables, but it's hard if you have a 1080p or 2K display 🙂

          Also, please take a look into chroma subsampling: https://en.wikipedia.org/wiki/Chroma_subsampling It's also a very important factor.

            aiaf that's an awesome finding. I've been wondering about using a lower-bandwidth cable to purposely bottleneck the output. I'm sure this would also apply to other computers. I'm curious whether, if you got a Dell UP2720Q true 10-bit monitor, it would still use dithering. I did look, and Dell acknowledges both "8+A FRC" and "true 10bit". So I think the UP2720Q may be the cheapest true 10-bit option if anyone wants to try it. I think you can get them as cheap as $800 from B&H Photo in open-box condition through their eBay store, and I think their site as well.

            Pics show the true 10-bit listing and then an 8-bit+FRC one, to show that Dell advertises both, which makes me trust the UP2720Q being true 10-bit.


              MTNEYE I don't think that changing the display on a laptop is going to remove dithering, because the connection cable from your motherboard to the new display is still the same and capable of sending the same amount of data to the display, so the display will show the same thing, but I might be wrong. I think the same thing happens when you connect external displays.

                madmozg The Samsung Odyssey G7 is a 1440p panel, so there's that. You want to find a combination of resolution + refresh rate that makes 10bpc transmission physically impossible on your cable.

                Re. YUV, I'm certain that the output's pixel encoding is set to RGB, unless I'm misunderstanding something here.

                jordan unfortunately I don't have access to a true 10-bit panel (unless you count the built-in display, if that turns out to be true). If anyone here has a Pro Display XDR, you'd do us a real solid by comparing it to the image of an M2/M3 Max/Pro; at the very least it would tell us whether they use the same tech.

                  @aiaf I've just reproduced how dithering behaves with different types of cables and an external monitor with Stillcolor on/off. Making a video; will post in an hour or two.

                  Looks very promising

                  aiaf this is exceptional. I've suspected quality changes between cables before. Maybe there is a flag or value where we can pick up the negotiated bandwidth or info about what is active. I looked into creating a tool to continuously diff parts of ioreg to show deep changes as they happen, but haven't found enough time yet. I looked around in some other places as well, and there is a lot of info. Run ioreg with the -a flag to output XML and more easily see deeper hierarchical values that are mostly there for debugging.
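
                  A quick manual version of that idea looks like this (a sketch: AppleCLCD2 is just one display-related class to start with, and the interesting keys may live elsewhere on your machine):

                  # Take two XML snapshots of the display node a few seconds apart and diff them.
                  ioreg -l -w0 -r -c AppleCLCD2 -a > /tmp/clcd2_a.xml
                  sleep 5
                  ioreg -l -w0 -r -c AppleCLCD2 -a > /tmp/clcd2_b.xml
                  diff /tmp/clcd2_a.xml /tmp/clcd2_b.xml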

                  What I don't get is the many references to bit depth 8 for the built-in monitor.

                  Not sure, but I guess interference from power cables might also be capable of affecting the bandwidth.


                    async Diffing is essential, thanks for the -a flag tip. I planned to do it for this investigation but thought I'd post my initial findings first. Regarding Depth in the registry, it's not what you might think it means.

                    Depending on where it appears, the known values are 4 and 8 (you can confirm this by looking at the AllRez source code). Mine was set to 7 at some point in the past couple of months; I'm not sure how or why, or whether it was me who did it.

                    • DepthFormat = 8 => PixelEncoding = "--RRRRRRRRRRGGGGGGGGGGBBBBBBBBBB"
                    • DepthFormat = 4 => PixelEncoding = "--------RRRRRRRRGGGGGGGGBBBBBBBB"

                    So a DepthFormat of 8 means 10bpc.
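
                    To spot-check what your machine currently reports for these keys, a rough grep over the whole registry works (the exact node they live under can differ; the key names here are taken from the registry output discussed above):

                    # Spot-check the DepthFormat / PixelEncoding values discussed above.
                    ioreg -l -w0 | grep -iE 'DepthFormat|PixelEncoding'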

                      aiaf I created a continuous diff solution. It's a starting point and can be adapted to other interesting areas. I finally got a working conversion of the data to JSON, so I used a JSON diff that also makes it easy enough to see where things changed.

                      I just created it, so I haven't tested anything yet, but you can see AmbientBrightness changing live, for example. I'm sure there are areas related to the cable bandwidth that you can monitor as well, which can be found through IORegistryExplorer.

                      This script needs jd for the diffing, so run brew install jd first; then it's more or less just sh iodiff.sh:

                      #!/bin/bash

                      # Dump the AppleCLCD2 display node as XML, rewrite multi-line <data> blobs
                      # into <string> elements so plutil accepts them, then convert the result to JSON.
                      CMD="ioreg -l -d0 -w0 -r -c AppleCLCD2 -a \
                      | sed -e '/^\t*<data>/,/^\t*<\/data>/ { /^\(\t*\)<data>\(.*\)/ { s//\1<string>_data:\2/; h; d; }; /^\t*<\/data>\(.*\)/! { s/^\t\(.*\)/\1/; H; d; }; /^\t*<\/data>\(.*\)/ { s//\1:data_<\/string>/; H; g; s/\n//g; }; }' \
                      | plutil -convert json -r - -o -"
                      SLEEP_TIME=5

                      OLD_FILE="/tmp/iodiff_old.json"
                      NEW_FILE="/tmp/iodiff_new.json"

                      # Initialize both files before starting the loop
                      eval "$CMD" > "$OLD_FILE"
                      cp "$OLD_FILE" "$NEW_FILE"

                      print_heading() {
                        # printf instead of echo, since bash's echo doesn't interpret \033 escapes by default
                        printf '\033[1m%s\033[0m\n' "$1"
                      }

                      # Print the initial snapshot once, then only show diffs
                      print_heading "$(date)"
                      cat "$OLD_FILE"
                      echo ""

                      while true; do
                        eval "$CMD" > "$NEW_FILE"
                        OUTPUT=$(jd -color "$OLD_FILE" "$NEW_FILE")
                        if [ -n "$OUTPUT" ]; then
                          print_heading "$(date +"%Y-%m-%d %H:%M:%S")"
                          echo "$OUTPUT"
                        fi
                        mv "$NEW_FILE" "$OLD_FILE"

                        sleep "$SLEEP_TIME"
                      done

                        I accidentally discovered today that when I connect my MBA M3 to my Asus VG27AQ 2K monitor over HDMI, the banding on the gray gradient changes when I turn the Stillcolor app on and off. Attaching photos of the display.

                        I was surprised to see this, because I had always been using DisplayPort to connect to my external display. So after that I tried the same thing again with my regular DisplayPort cable: nothing changed, the gradient looked the same whether it was on or off.

                        So then I thought maybe my HDMI cable is a lower version that can't provide enough bandwidth for 2K 10-bit 144 Hz. So I took the 2.1 cable from my PS5 (I hope the PS5 ships with a 2.1 cable) and tried it, and I again saw banding when Stillcolor was ON. After that I also realized that my monitor only supports HDMI 2.0. I also tried connecting with some USB-C dongles; in all cases HDMI showed that banding when Stillcolor was ON.

                        After that I went to a store to find a DisplayPort 1.1 cable, but it's too outdated (I think it dates from around 2005-2008), so I can probably only order one on eBay for testing. So right now there is no way for me to try an older version; I only have 1.2.

                        Conclusion from my side:

                        • MBA M3 + Stillcolor ON, 144 Hz, HDMI 2.0 -> external display: ✅ shows banding
                        • MBA M3 + Stillcolor ON, 60 Hz, HDMI 2.0 -> external display: ❌ doesn't show banding
                        • MBA M3 + Stillcolor ON, 144 Hz, DisplayPort 1.2 -> external display: ❌ doesn't show banding
                        • MBA M3 + Stillcolor ON, 60 Hz, DisplayPort 1.2 -> external display: ❌ doesn't show banding

                        Photos for 2K 60 Hz with HDMI and Stillcolor on/off:

                        As you can see, when I enable 60 Hz on the 2K display there is no banding in the gradient, which means the cable has enough bandwidth to send a 10-bit 2K signal at 60 Hz, but it can't at 144 Hz and switches to 8-bit. But I don't understand why I'm only able to control it with the Stillcolor app at 144 Hz. Is it because the bandwidth is right on the edge and macOS lets you do this? Or is this just how the HDMI implementation works right now? I remember that before, M1 MacBooks had problems with HDMI, using an old version of it.
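
                        For what it's worth, that reading lines up with a rough back-of-the-envelope calculation (active pixels only, ignoring blanking, and assuming HDMI 2.0's roughly 14.4 Gb/s of usable data rate and the VG27AQ's native 2560x1440):

                        # Active-pixel data rates for 2560x1440 over an HDMI 2.0 link (~14.4 Gb/s usable)
                        echo "scale=1; 2560*1440*144*30 / 10^9" | bc   # 144 Hz @ 10bpc ≈ 15.9 Gb/s -> doesn't fit
                        echo "scale=1; 2560*1440*144*24 / 10^9" | bc   # 144 Hz @  8bpc ≈ 12.7 Gb/s -> fits
                        echo "scale=1; 2560*1440*60*30 / 10^9" | bc    #  60 Hz @ 10bpc ≈  6.6 Gb/s -> fits

                        So at 144 Hz the link only has room for 8bpc, which would explain why the Stillcolor toggle is only visible there, while at 60 Hz the monitor can receive 10bpc either way.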

                        Also, when connecting the monitor over HDMI I can select 1920 as HiDPI in settings, but when connecting over DisplayPort I only get 1280 as HiDPI. Weird, I don't know why. And if I select 1920 over DisplayPort, the image is blurry compared with HDMI.

                        From a symptoms perspective, in situations where I can see the banding I get weird symptoms, even weirder than when just working on the laptop screen: strange brain fog and eye strain, also slight nausea that comes and goes, very weird. It's also very bad when looking at this gray gradient. Hopefully this information helps somehow 🙂

                          madmozg Totally, yeah, I get that. I just meant that sometimes non-OEM screens yield different (usually worse) results than OEM. Basically there's no crazy increase in flashing or dithering, same as before; also the display doesn't appear to be an sRGB/RGB/6-bit+FRC etc. panel, as you might expect from a cheaper replacement.

                          Not really related because it's PWM, but for example swapping an iPhone 13 mini I had from the OEM OLED to an LCD got rid of the PWM, with some drawbacks. Also changing my iPhone 12 mini (no PWM at max brightness) to an LCD got rid of the annoying dithering they added in iOS 17.

                            dev