So my Lenovo X280 and Surface Pro 2 are both strain free, even without ditherig, and even when connected to a Samsung external display.

The same external display produces terrible eye strain with a Lenovo L13 Gen 2, even with ditherig.

All of them run the newest version of Windows 11.

So there is clearly something else at play than just temporal dithering. It is not OS- or driver-version specific, and it is not panel specific either; it seems to be something the display adapter is doing.

What is it?

    Maxx do you have Windows ACM turned on? ("Automatically manage color for apps" in advanced display settings.) If so, turning it off should fix this.

    The reason some Windows 11 machines don't cause as much strain is that not every computer "supports" ACM. The ones that do (generally newer machines) will dither like crazy.

    https://ledstrain.org/d/1908-update-from-win11-21h2-to-win11-22h2-leads-to-eye-strain/145

    This "feature" was introduced in the Windows 11 2022 update and introduces Windows's own OS-level temporal dithering that runs on top of whatever your GPU already is doing.

    photon78s In the case of this monitor, it seems like the panel itself is causing the dither you see even with Stillcolor; it's most likely a 6-bit + FRC panel

    (or a "10bit capable" panel using 8bit + FRC, which is being activated whenever it receives the 10bit signal that M1 Macs generate.)

    photon78s Display Specifications reports the LG 27GP95R as 8-bit + FRC, so what you're measuring is likely the panel's own FRC. This will remain true as long as the output from your Mac or PC is 10-bit (can you check what Windows is sending?) and as long as your display is not true 10-bit.
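
    If you want to actually check what Windows is sending, a rough Win32 sketch like the one below (display configuration API, so it needs a reasonably recent SDK) prints the bits per color channel for each active output. Treat bitsPerColorChannel as a hint only; on some driver stacks it reflects the desktop composition depth rather than the final wire format.

    // Sketch: query the bit depth Windows currently reports for each active display
    // (Win32 display configuration API, Windows 10 1709+; link with user32.lib).
    #include <windows.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        UINT32 numPaths = 0, numModes = 0;
        if (GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS, &numPaths, &numModes) != ERROR_SUCCESS)
            return 1;

        DISPLAYCONFIG_PATH_INFO *paths = calloc(numPaths, sizeof(*paths));
        DISPLAYCONFIG_MODE_INFO *modes = calloc(numModes, sizeof(*modes));
        if (QueryDisplayConfig(QDC_ONLY_ACTIVE_PATHS, &numPaths, paths, &numModes, modes, NULL) != ERROR_SUCCESS)
            return 1;

        for (UINT32 i = 0; i < numPaths; i++) {
            DISPLAYCONFIG_GET_ADVANCED_COLOR_INFO info = {0};
            info.header.type = DISPLAYCONFIG_DEVICE_INFO_GET_ADVANCED_COLOR_INFO;
            info.header.size = sizeof(info);
            info.header.adapterId = paths[i].targetInfo.adapterId;
            info.header.id = paths[i].targetInfo.id;
            if (DisplayConfigGetDeviceInfo(&info.header) == ERROR_SUCCESS) {
                printf("Target %u: %u bits per color channel, advanced color %s\n",
                       paths[i].targetInfo.id, info.bitsPerColorChannel,
                       info.advancedColorEnabled ? "on" : "off");
            }
        }
        free(paths);
        free(modes);
        return 0;
    }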

      Blooey This confirms my theory that specifically on XDR mini-LED Macs (such as your 16" Pro) there is some additional dithering.

      I suspect it most likely comes directly from an "8-bit + FRC" panel in the Pro that is receiving the Mac's 10-bit signal, similar to external monitors that include their own FRC.

      What seems to be the trend is Stillcolor makes some degree of improvement but doesn't truly "fix" XDR M1 Macs… but on LCD M1 Macs like the M2 Air, Stillcolor has much better results and can significantly improve the display.

      Remember that all XDR M1 Macs use what is called "hardware reference modes" for colors and try to hide away the standard color profile menu. On the other hand, LCD M1 Macs use standard color profiles just like Intel Macs. There is definitely a significant difference in the color management pipeline between display types.

      It's actually still possible to force an XDR Mac to load a color profile, but the interesting thing is whatever white point it has will actually be multiplied by the white point of the reference mode. This means that "reference modes" are an entirely different layer of color processing (probably done by the DCP firmware instead of the OS) and not just a simplified "abstraction of color profiles".

      (Interestingly enough, the Studio Display is the one counterexample — it's a standard LCD, but uses the new "hardware reference modes", unlike the other Mac LCDs. I would avoid the Studio Display.)

      The M2 Air also claims to have 1 billion colors (10bit), but potentially, the 10bit to 8bit + FRC down-conversion is done via the GPU on the Air instead of on the panel itself. (I've seen a few Windows laptops do 6bit + FRC entirely on the GPU, so it's certainly possible.)

      This may be why M2 Air users are reporting both better improvements in screen comfort and also more obvious changes in banding.

      I really wish I had an M2 Air instead of my XDR M1 Pro so I could see the difference myself 🙂

        Blooey how can you determine under a microscope whether it's dithering, PWM, or some other type of LED flicker effect? What's the FPS of your recording? From my testing, Apple dithering matches your display's refresh rate.

          aiaf This is the best info we have… at least two manufacturers and potentially a third.

          https://www.macrumors.com/2022/01/05/apple-mini-led-supplier-hits-quality-hurdle/

          Apple currently uses just two suppliers of mini-LED chips, the main one being Taiwan-based Epistar and the other Germany-based Ams Osram. Epistar intends to expand its already fully utilized chip production capacity to Taiwan and China, while Ams Osram began supplying Apple in the second half of 2021.

          China-based LED chipmaker Sanan Optoelectronics was thought to be next in line to pick up Apple's business, with Sanan originally expected to become the third supplier of mini-LED chips for Apple as soon as the fourth quarter of last year.

          I've witnessed the effects of there being two manufacturers "with my own eyes" so to speak. I have 14" M1 Max MBP, "friend A" has 16" M2 Pro MBP, and "friend B" has 14" M3 Max MBP.

          My M1 and "friend B's" M3 look absolutely terrible to my eyes, have a non-uniform-feeling backlight, and a feeling of "glowing text" (yes, glowing against all backgrounds, not just black). The screens have a very subtle greenish tint on white.

          On the other hand, "friend A's" M2, which is also an XDR Pro, looks so much better. It's of course still using temporal dithering, but it is a lot easier to look at, the text doesn't glow, and it looks noticeably sharper. The screen has a reddish tint on white instead of green. It also has less of that infamous "fade to dark" at the very edges that mini-LED displays usually have.

          Friend A could also tell that our screens looked different, even though he isn't screen sensitive. He said mine looked more "plastic", for whatever that's worth…

          Also, friend B with the other "bad screen" said he couldn't see the "glowing text" that I was trying to point out, so this aspect of whatever kind of panel his and my laptop are using seems to only be visible to some people's eyes.

          (I honestly actually like looking at friend A's laptop more than even some Intel Macs, even though I still get some of the symptoms of dithering.

          But I can't stand looking at mine or friend B's… mine has only become remotely "tolerable" for the first time with Stillcolor and still has so much more "glowing text" and blurriness than friend A's at default settings.)

          All 3 laptops are running the same Sonoma version and are all set to native Retina resolution and the default Apple Display XDR reference mode. All 3 have Night Shift and True Tone off.

          However, I've seen a 16" Pro in the wild at one point that looked just as bad as my M1, so this is not a "14-inch vs. 16-inch" deal.

          aiaf

          Here's mine. Intel 2015 15" Retina MacBook Pro with AMD graphics on macOS Monterey.

          https://drive.google.com/file/d/15yOrZe3ypPX3u6ciOQfHTwmGBUG-xbnO/view?usp=sharing

          Search for "dith". There are two interesting keywords that pop up…

          "kConnectionControllerDitherControl" and "dith = 1" 🙂

          I think this will help:

          https://developer.apple.com/documentation/iokit/1505962-anonymous/kiodisplayditherdefault?changes=la___2

          Look at the sidebar on the left 👀

          kIODisplayDitherAll = 0x000000FF
          kIODisplayDitherDefault = 0x00000080
          🐚 kIODisplayDitherDisable = 0x00000000 🐚
          kIODisplayDitherFrameRateControl = 0x00000004
          kIODisplayDitherRGBShift = 0
          kIODisplayDitherSpatial = 0x00000001
          kIODisplayDitherTemporal = 0x00000002
          kIODisplayDitherYCbCr422Shift = 16
          kIODisplayDitherYCbCr444Shift = 8

          In the dump, kConnectionControllerDitherControl is set to "0x00808080 : RGB Default,444 Default,422 Default", where "default" implies dithering is most likely enabled.

          The 444 and 422 fields are of course for the YCbCr external monitor modes; RGB matters most for the internal display. Getting the whole thing down to 0x00000000 would be ideal, and so would setting whatever "dith" is to 0 instead of 1.
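
          To make the byte layout concrete, here's a quick decode sketch using the constants above. This is just my reading of that header listing, not something verified against the driver:

          // Decode a kConnectionControllerDitherControl value into its per-encoding
          // byte fields, using the shift/flag constants from Apple's IOKit docs above.
          #include <stdio.h>
          #include <stdint.h>

          enum {
              kIODisplayDitherDisable          = 0x00000000,
              kIODisplayDitherSpatial          = 0x00000001,
              kIODisplayDitherTemporal         = 0x00000002,
              kIODisplayDitherFrameRateControl = 0x00000004,
              kIODisplayDitherDefault          = 0x00000080,
              kIODisplayDitherRGBShift         = 0,
              kIODisplayDitherYCbCr444Shift    = 8,
              kIODisplayDitherYCbCr422Shift    = 16,
          };

          static void describe(const char *label, uint8_t field) {
              printf("%s: 0x%02X%s%s%s%s%s\n", label, field,
                     field == kIODisplayDitherDisable         ? " (disabled)" : "",
                     field & kIODisplayDitherSpatial          ? " +spatial"   : "",
                     field & kIODisplayDitherTemporal         ? " +temporal"  : "",
                     field & kIODisplayDitherFrameRateControl ? " +FRC"       : "",
                     field & kIODisplayDitherDefault          ? " +default"   : "");
          }

          int main(void) {
              uint32_t control = 0x00808080;  // the value from the 2015 MacBook Pro dump
              describe("RGB      ", (control >> kIODisplayDitherRGBShift)      & 0xFF);
              describe("YCbCr 444", (control >> kIODisplayDitherYCbCr444Shift) & 0xFF);
              describe("YCbCr 422", (control >> kIODisplayDitherYCbCr422Shift) & 0xFF);
              // A fully dither-off value would read 0x00000000 in every field.
              return 0;
          }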

          The dump was saved with the Mac in AMD dGPU mode. Another dump I saved when the Mac was in Intel iGPU mode also uses the same keywords, so whatever OS-level dither disable method is found for Intel Macs can probably work for both.

          I say OS-level because it seems like this is an additional layer that macOS adds on top of whatever the GPU's native dithering does (yes, that is a real thing). The dither=0 boot-arg actually does disable "color table dithering" on the 2015 in Intel mode, with banding shifting slightly between color profiles (but still seeming more precise than what the display should be able to show). I've even figured out a way to disable AMD's GPU dithering (yep), which I'll talk about soon enough. But even with those changes, there's still obvious text shimmer and dithering symptoms on both GPUs, which I think can finally be resolved once we figure out how to change these IOKit parameters.

            DisplaysShouldNotBeTVs I'll be interested in reading what you say about AMD later. It's frustrating because, on Linux, older versions of the driver let you use aticonfig or amdconfig to set whether dither was on or off for individual output ports, but that program isn't around on my install, so I'm not even sure anything is exposed at all anymore.

            DisplaysShouldNotBeTVs thanks for this! These were the first dithering settings I encountered, but of course they were not relevant to Apple Silicon Macs. They seem trivial to modify using the same methods used in Stillcolor. Can you also send me the Intel GPU AllRez? It will contain important details for matching with an IOService.
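
            For reference, the kind of user-space write I have in mind looks roughly like the sketch below. To be clear: this uses a plain IORegistryEntrySetCFProperty call rather than Stillcolor's exact mechanism, the class name comes from your dump, and whether the driver accepts this key as a settable property at all is an open question.

            // Hypothetical sketch: try to zero a dither-related key on the Intel
            // framebuffer service. "AppleIntelFramebuffer" is taken from the AllRez
            // dump; the key may only be a reported attribute, not a settable property,
            // and the real Stillcolor approach may differ.
            // Build: clang set_dither.c -framework IOKit -framework CoreFoundation
            #include <IOKit/IOKitLib.h>
            #include <CoreFoundation/CoreFoundation.h>
            #include <stdio.h>

            int main(void) {
                io_service_t fb = IOServiceGetMatchingService(
                    kIOMasterPortDefault, IOServiceMatching("AppleIntelFramebuffer"));
                if (fb == IO_OBJECT_NULL) {
                    fprintf(stderr, "framebuffer service not found\n");
                    return 1;
                }

                int32_t off = 0;  // target: every dither field zeroed
                CFNumberRef value = CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt32Type, &off);

                kern_return_t kr = IORegistryEntrySetCFProperty(
                    fb, CFSTR("kConnectionControllerDitherControl"), value);
                printf("set returned 0x%x (%s)\n", kr, kr == KERN_SUCCESS ? "accepted" : "rejected");

                CFRelease(value);
                IOObjectRelease(fb);
                return 0;
            }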

              aiaf Alright here's the 2015 in Intel graphics mode.

              https://drive.google.com/file/d/1BIhctVrKJsT_KYeltN3M8T8kE7fUuZqE/view?usp=sharing

              This dump seems to include some AMD info too, but now it starts with AppleIntelFramebuffer instead, so it should have what you're looking for.

              Looks like here we also have kConnectionControllerDitherControl but now "dith" isn't present anymore.

              So "dith" might be AMD-specific whereas "dither control" is universal/OS-level

              (but I'd love to be able to set dith to 0 as well)

              aiaf

              When I use that monitor with my Windows machine, I have the Nvidia Control Panel set to 8-bit and ColorControl set to dithering disabled. That may be why I see less activity under the scope prior to testing the Mac. I also don't have the ACM toggle visible on Windows (I need to try disabling it in the registry).


                photon78s Try Windows with 10-bit and dithering disabled and tell us what you see!

                  IOMobileframeBuffer Properties

                  Here's a diff of the top-level IOMobileframeBuffer properties present/true on the built-in display vs a Samsung G7
                  https://www.diffchecker.com/KWNHGET2/

                  Can any display experts guess what these things do?
                  I believe temperature compensation is rather standard stuff on LCDs but what about the rest? enableDarkEnhancer looks particularly interesting.

                  When I get a chance I'll try to play with these, but it's going to be tough without knowing what to observe or measure, and I also risk possibly ruining the display. (There's a rough sketch for dumping these keys from user space after the property list below.)

                  APTDevice = true;
                  APTEnableCA = true;
                  APTEnablePRC = true;
                  
                  BLMAHOutputLogEnable = false;
                  BLMAHStatsLogEnable = false;
                  BLMPowergateEnable = true;
                  BLMStandbyEnable = false;
                  BypassPCC2DLed = false;
                  
                  DisablePCC2DBrc = false;
                  DisableTempComp = false;
                  
                  IOMFBBrightnessCompensationEnable = true;
                  IOMFBSupports2DBL = true;
                  IOMFBTemperatureCompensationEnable = true;
                  
                  PCC2DEnable = true;
                  PCCCabalEnable = true;
                  PCCEnable = true;
                  PCCTrinityEnable = false;
                  PDCSaveLongFrames = true;
                  PDCSaveRepeatUnstablePCC = true;
                  
                  VUCEnable = true;
                  
                  enable2DTemperatureCorrection = true;
                  enableAmbientLightSensorStatistics = false;
                  enableBLMSloper = true;
                  enableDBMMode = true;
                  enableDarkEnhancer = true;
                  
                  enableLAC = true;
                  requestPixelBacklightModulation = false;
                  uniformity2D = true;
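
                  For anyone who wants to pull the same property dump themselves, here's a rough user-space sketch. The class name to match is an assumption on my part; check ioreg -l for what your machine actually calls the entry carrying these keys:

                  // Dump the top-level properties of the built-in display's framebuffer
                  // registry entry. "AppleCLCD2" is only a guess at the class name; pass
                  // the right one as an argument if yours differs.
                  // Build: clang dump_props.c -framework IOKit -framework CoreFoundation
                  #include <IOKit/IOKitLib.h>
                  #include <CoreFoundation/CoreFoundation.h>
                  #include <stdio.h>

                  int main(int argc, char **argv) {
                      const char *className = (argc > 1) ? argv[1] : "AppleCLCD2";
                      io_service_t entry = IOServiceGetMatchingService(
                          kIOMasterPortDefault, IOServiceMatching(className));
                      if (entry == IO_OBJECT_NULL) {
                          fprintf(stderr, "no IOService of class %s found\n", className);
                          return 1;
                      }

                      CFMutableDictionaryRef props = NULL;
                      if (IORegistryEntryCreateCFProperties(entry, &props, kCFAllocatorDefault,
                                                            kNilOptions) == KERN_SUCCESS) {
                          CFShow(props);  // prints keys like PCC2DEnable, enableDarkEnhancer, ...
                          CFRelease(props);
                      }
                      IOObjectRelease(entry);
                      return 0;
                  }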

                    aiaf

                    Here are Dropbox uploads of 240 fps microscope videos of a Mac Mini M2 versus a Legion 7i, both connected with the same USB-C to DisplayPort cable to a 144 Hz LG monitor (the LG does not have a USB-C port) set to 100% brightness. The 7i shows 8-bit in advanced display settings on Windows 11 23H2 (build 22631.3235), and I'm using ColorControl to supposedly disable dithering. The GPU is set in the BIOS to the Nvidia RTX 4080. I will do another round of testing when I have Windows 1809 running.

                    The microscope was recording the central section of the Lagom gradient (banding) page. The videos titled "dark" are of the center part of the largest moon in this picture: https://beta.digitalblasphemy.com/2024/02/06/sinua/

                    Can you see the differences? It looks to me like the flicker is different between the two. Also check the gradient tests with Stillcolor OFF. For those who are really sensitive to visible flicker, please don't watch these videos.

                    https://www.dropbox.com/scl/fo/os2v37wcx5n33om1uajmp/h?rlkey=73kzo0fi93qxbytwbojjmcyq9&dl=0


                      photon78s are these slowed down or in real time? What is the FPS of the original recording? QuickTime is showing 30 fps, and if your screen is at 144 Hz then we are simply missing a ton of frames, making it difficult to say what is actually happening.

                        aiaf

                        These are straight-out-of-camera, unedited files from the Samsung S10+ "slow motion mode". When you play them back in QuickTime, you are watching the motion slowed down. Looking through the scope with the naked eye, I definitely cannot see this pixel flicker.

                        I see the problem with this technique: https://eu.community.samsung.com/t5/galaxy-s10-series/i-can-no-longer-record-slow-motion-on-my-s10-and-output-mp4/td-p/1528847
                        The default camera app does not record at real-time speed; everything is already slowed down.

                        Will try setting the screen to 60 Hz and using the Nikon Z7 for true 120 fps 1080p video.
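
                        The rough sampling math behind that change (my own back-of-the-envelope, not a measurement):

                        // Camera frames per display refresh: you want at least ~2 samples
                        // per refresh (the Nyquist minimum) to resolve per-refresh flicker
                        // such as FRC / temporal dithering.
                        #include <stdio.h>

                        static void report(double capture_fps, double refresh_hz) {
                            printf("%5.0f fps capture of a %5.0f Hz panel: %.2f frames per refresh\n",
                                   capture_fps, refresh_hz, capture_fps / refresh_hz);
                        }

                        int main(void) {
                            report(240.0, 144.0);  // current setup: ~1.67, marginal
                            report(120.0,  60.0);  // planned 60 Hz + Z7 setup: 2.00, right at the limit
                            return 0;
                        }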

                        Thank you. Now the search continues for better external monitors!
