Sunspark there are clearly so many variables to be tested. What are your sensitivities, and which display are you currently comfortable with?

Sentiny it should be plenty safe. I don't like True Tone personally, but I don't think any of these options will re-enable dithering. Again, that can only be known with proper testing. You can also re-check ioreg to see whether enableDither was set back to Yes.
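
If you want to verify from the terminal, something like this should show the current state (assuming the property is still exposed as enableDither on your macOS version):

```
# List IORegistry properties and filter for the dithering flag.
# "enableDither" = No means dithering is still disabled.
ioreg -lw0 | grep -i enableDither
```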

Donux try your phone's manual video mode with the fastest shutter speed you can set. You need to test against an image with a white background. Do you trust this method?

async the idea would be to edit the EDID in the capture card, and then capture the output to see how a particular config affects the image. You can then run the captures through ffmpeg or whatever other tools and diff them to see the difference.
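
For example, if you capture a few seconds of a static test image, every frame should be identical once temporal dithering is truly off. A rough sketch (capture.mkv is a placeholder for your capture file):

```
# One MD5 per decoded frame; on a static image, repeating hashes
# mean no pixel changed between frames, i.e. no temporal dithering.
ffmpeg -i capture.mkv -f framemd5 -

# Or visualize frame-to-frame differences; anything non-black is changing.
ffmpeg -i capture.mkv -vf "tblend=all_mode=difference" diff.mp4
```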

twomee Which external display brand and model are you using?

DisplaysShouldNotBeTVs I'm not working on an iOS version since I don't use iPhones and currently don't have the time to pursue this. I'll tinker around, but no promises. In any case, my code and methodology are open source, so anyone can pick up where I left off.

photon78s thanks for the Carson MicroFlip recommendation. Even with dithering disabled at the GPU level, there's still the question of whether the external display panel does dithering/FRC on its own. Can't trust manufacturer specs when it comes to these things.

markdotpeters5 do you have an M1 Mac? Download the app and try it. I tested with M2 and M3 and it worked flawlessly. Would love an M1 datapoint.

    DisplaysShouldNotBeTVs would the M2 MacBook Air 15" also be safe, or only the 13", in terms of being an LCD free of mini-LED? I think I'm going to buy one 😅 unless we can get a fix on Intel/Nvidia too?!

    markdotpeters5

    I was going to try the M1 Air at some point (it's hard for me to switch ecosystems), but these observations… It might still be a combination of extremely high-frequency dithering plus high-frequency PWM. I remember watching an Apple event in the past couple of years where they had someone come up to talk about "screen safety". Seems hilarious to me.

      photon78s I'm very sensitive to stuff like this but use the M1 Air fine. Phone-wise I'm stuck on the iPhone 11; I literally can't find any newer handset that I can use, Android or iPhone.

      I have no real reason to upgrade from the M1 Air at the moment, but I really would like a newer iPhone!

      DisplaysShouldNotBeTVs that's an amazing setup! Feel free to get in touch any time.

      w/r/t an additional layer of dithering on XDR displays: I'll need proof of this. I see the banding clearly (on the Lagom gradient test) on the built-in display when I disable dithering, and the screen becomes a lot more eye-friendly. There's still PWM. But what would additional dithering look like, and how would it interact with GPU dithering? You can only show one pixel color per refresh, so spatial dithering perhaps? We should also not forget these are supposedly high-end 8-bit panels.

        Will this work on an M2 Mac mini?

        DisplaysShouldNotBeTVs and I'd love to provide any info I know of that would help in creating an Intel version

        @NewDwarf has a technique here for disabling temporal dithering on Intel Macs and also AMD (it's a different technique from the one linked by OP). I don't believe it was tested with a lossless capture card, but some have reported success on Intel Macs.

        DisplaysShouldNotBeTVs There are two layers of dithering on XDR Macs 😱 one at the GPU level (which Stillcolor does disable) and one at the display panel level (which it can't disable)

        This is interesting and unfortunate. It's hard to imagine two separate temporal dithering algorithms looking good on a screen, even to "normal" people.

          smilem

          Thanks.

          From Harrison at https://ledstrain.org/d/125-monitor-overclock/3:

          "If dithering is the problem, a higher Refresh rate will also lead to a higher dithering frequency. Always the half of the refreshing rate of the display."

          Good to know, if true: at 120 Hz the dithering pattern would alternate at 60 Hz, instead of the 30 Hz you'd get at a 60 Hz refresh. Still, a multi-factor approach is needed. Too bad the Xiaomi 14 Ultra's 1920 fps mode is not really 1920 fps.

          aiaf

          Thank you! I hope we can now really see how monitors, and maybe cables, behave differently, independent of software issues.

          smilem AFAIA dithering can be disabled on Intel GPUs using the Dithering app and on Nvidia GPUs using the ColorControl app. Unless you know different?

            This is amazing. After doing visual therapy for years, trying to get adjusted to seeing these new screens as unaffected people do (with significant measurable success, but never 100%), and pushing a plethora of eye exercises here with very variable results from person to person, this is the tech solution I've been hoping for.

            However, I only have a 2018 Intel/AMD Mac; is there any chance this could be made to work for those too? I still have old Mojave on it; I postponed updating as long as I could due to distrust of the updated graphics in newer OS releases.

              martin The ideal solution is to have one universal app for disabling dithering on all Intel, AMD, and Apple Silicon Macs, plus a feature that reports whether a display is true 8-bit or 10-bit, as opposed to 6-bit+FRC or 8-bit+FRC. A lot of monitor enthusiasts seem to dismiss that distinction as an irrelevant detail, but we know it makes all the difference in the world. What displays are you using right now? Are they true 8/10-bit?
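
              In the meantime, a rough hint on the macOS side (this reports what the OS is driving, not the panel's native bit depth, which is exactly why a proper reporting tool is needed):

              ```
              # Lists connected displays with resolution and framebuffer depth.
              system_profiler SPDisplaysDataType
              ```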

                twomee is it this model? https://www.rtings.com/monitor/reviews/gigabyte/m34wq

                RTings reports that it accepts a 10-bit signal and displays it as 8-bit+FRC, despite being advertised as 8-bit only.

                There's a chance that your Linux distro is sending an 8-bit signal, so you get no dithering, while your Mac is sending a 10-bit signal (dither-free with Stillcolor); since your display is not truly 10-bit, it then applies its own FRC.
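
                Easy enough to test from the Linux side, assuming an Xorg session and a driver that exposes the "max bpc" connector property (amdgpu and i915 do; DP-1 is a placeholder for your output name):

                ```
                # Show the current bit-depth cap on each connector.
                xrandr --props | grep -i -A1 "max bpc"

                # Allow a 10-bit signal; if banding disappears but flicker
                # appears, the monitor is dithering (FRC) on its own.
                xrandr --output DP-1 --set "max bpc" 10
                ```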

                  dev