aiaf This is the best info we have… at least two manufacturers and potentially a third.

https://www.macrumors.com/2022/01/05/apple-mini-led-supplier-hits-quality-hurdle/

Apple currently uses just two suppliers of mini-LED chips, the main one being Taiwan-based Epistar and the other Germany-based Ams Osram. Epistar intends to expand its already fully utilized chip production capacity to Taiwan and China, while Ams Osram began supplying Apple in the second half of 2021.

China-based LED chipmaker Sanan Optoelectronics was thought to be next in line to pick up Apple's business, with Sanan originally expected to become the third supplier of mini-LED chips for Apple as soon as the fourth quarter of last year.

I've witnessed the effects of there being two manufacturers "with my own eyes," so to speak. I have a 14" M1 Max MBP, "friend A" has a 16" M2 Pro MBP, and "friend B" has a 14" M3 Max MBP.

My M1 and "friend B's" M3 look absolutely terrible to my eyes, have a non-uniform-feeling backlight, and a feeling of "glowing text" (yes, glowing against all backgrounds, not just black). The screens have a very subtle greenish tint on white.

On the other hand, "friend A's" M2, which is also an XDR Pro, looks so much better. It's of course still using temporal dithering, but it's a lot easier to look at, text doesn't glow, and everything looks noticeably sharper. The screen has a reddish tint on white instead of green. It also has less of that infamous "fade to dark" at the very edges that mini-LED displays usually have.

Friend A could also tell that our screens looked different, even though he isn't screen sensitive. He said mine looked more "plastic", for whatever that's worth…

Also, friend B with the other "bad screen" said he couldn't see the "glowing text" I was trying to point out, so this aspect of whatever kind of panel his and my laptops use seems to be visible only to some people's eyes.

(I honestly actually like looking at friend A's laptop more than even some Intel Macs, even though I still get some of the symptoms of dithering.

But I can't stand looking at mine or friend B's… mine has only become remotely "tolerable" for the first time with Stillcolor and still has so much more "glowing text" and blurriness than friend A's at default settings.)

All 3 laptops are running the same Sonoma version and are all set to native Retina resolution and the default Apple Display XDR reference mode. All 3 have Night Shift and True Tone off.

However, I've seen a 16" Pro in the wild at one point that looked just as bad as my M1, so this is not a "14-inch vs. 16-inch" deal.

aiaf

Here's mine. Intel 2015 15" Retina MacBook Pro with AMD graphics on macOS Monterey.

https://drive.google.com/file/d/15yOrZe3ypPX3u6ciOQfHTwmGBUG-xbnO/view?usp=sharing

Search for "dith". There are two interesting keywords that pop up…

"kConnectionControllerDitherControl" and "dith = 1" 🙂

I think this will help:

https://developer.apple.com/documentation/iokit/1505962-anonymous/kiodisplayditherdefault?changes=la___2

Look at the sidebar on the left 👀

kIODisplayDitherAll = 0x000000FF
kIODisplayDitherDefault = 0x00000080
🐚 kIODisplayDitherDisable = 0x00000000 🐚
kIODisplayDitherFrameRateControl = 0x00000004
kIODisplayDitherRGBShift = 0
kIODisplayDitherSpatial = 0x00000001
kIODisplayDitherTemporal = 0x00000002
kIODisplayDitherYCbCr422Shift = 16
kIODisplayDitherYCbCr444Shift = 8

In the dump, kConnectionControllerDitherControl is set to "0x00808080 : RGB Default,444 Default,422 Default", where "default" implies dithering is most likely enabled.

The 444 and 422 byte fields (at bit shifts 8 and 16) are of course for the YCbCr external monitor modes; the low RGB byte matters most for the internal display. Getting the whole value down to 0x00000000 would be ideal. And also setting whatever "dith" is to 0 instead of 1.
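To make the field layout concrete, here's a small sketch (my own illustration; only the constants come from the IOGraphicsTypes.h list above, and the helper name is made up) that decodes the dump's 0x00808080 value:

```python
# Sketch: decode a kConnectionControllerDitherControl word into its per-encoding
# byte fields, using the shift constants from IOGraphicsTypes.h quoted above.
# decode_dither_control is a hypothetical helper, not an Apple API.

DITHER_RGB_SHIFT = 0        # kIODisplayDitherRGBShift
DITHER_YCBCR444_SHIFT = 8   # kIODisplayDitherYCbCr444Shift
DITHER_YCBCR422_SHIFT = 16  # kIODisplayDitherYCbCr422Shift
DITHER_DEFAULT = 0x80       # kIODisplayDitherDefault
DITHER_DISABLE = 0x00       # kIODisplayDitherDisable

def decode_dither_control(value: int) -> dict:
    """Split the 32-bit control word into one byte per pixel encoding."""
    return {
        "RGB": (value >> DITHER_RGB_SHIFT) & 0xFF,
        "YCbCr444": (value >> DITHER_YCBCR444_SHIFT) & 0xFF,
        "YCbCr422": (value >> DITHER_YCBCR422_SHIFT) & 0xFF,
    }

# The value from the AMD-mode AllRez dump: every field is kIODisplayDitherDefault.
fields = decode_dither_control(0x00808080)
assert all(v == DITHER_DEFAULT for v in fields.values())

# The "ideal" value discussed above: every field set to kIODisplayDitherDisable.
disabled = (DITHER_DISABLE << DITHER_YCBCR422_SHIFT) \
         | (DITHER_DISABLE << DITHER_YCBCR444_SHIFT) \
         | (DITHER_DISABLE << DITHER_RGB_SHIFT)
assert disabled == 0x00000000
```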

The dump was saved with the Mac in AMD dGPU mode. Another dump I saved when the Mac was in Intel iGPU mode also uses the same keywords, so whatever OS-level dither disable method is found for Intel Macs can probably work for both.

I say OS-level because this seems to be an additional layer that macOS adds on top of whatever the GPU's native dithering does (yes, that's a real thing). The dither=0 boot-arg actually does disable "color table dithering" on the 2015 in Intel mode, with banding shifting slightly between color profiles (but still seeming more precise than what the display should be able to show). I've even figured out a way to disable AMD's GPU dithering (yep), which I'll talk about soon enough. But even with those changes, there's still obvious text shimmer and dithering symptoms on both GPUs, which I think can finally be resolved once we figure out how to change these IOKit parameters.

    DisplaysShouldNotBeTVs I'll be interested in reading what you say about AMD later. It's frustrating because on Linux, older versions of the driver used aticonfig or amdconfig to set whether dither was on or off for individual output ports, but that program isn't around on my install, so I'm not even sure anything is exposed at all anymore.

    DisplaysShouldNotBeTVs thanks for this! These were the first dithering settings I've encountered, but of course they were not relevant to Apple silicon Macs. They seem trivial to modify using the same methods used in Stillcolor. Can you also send me the Intel GPU AllRez? It will contain important details for matching with an IOService.

      aiaf Alright here's the 2015 in Intel graphics mode.

      https://drive.google.com/file/d/1BIhctVrKJsT_KYeltN3M8T8kE7fUuZqE/view?usp=sharing

      This dump seems to include some AMD info too, but now it starts with AppleIntelFramebuffer instead, so it should have what you're looking for.

      Looks like here we also have kConnectionControllerDitherControl but now "dith" isn't present anymore.

      So "dith" might be AMD-specific whereas "dither control" is universal/OS-level

      (but I'd love to be able to set dith to 0 as well)

      aiaf

      When I use that monitor with my Windows machine, I have the Nvidia control panel set to 8-bit, with color control set to dithering disabled. That may be why I see less activity under the scope prior to testing the Mac. I also don't have the ACM toggle setting visible on Windows (I need to try disabling it in the registry).

      • aiaf replied to this.

        photon78s Try Windows with 10-bit and dithering disabled and tell us what you see!

          IOMobileFramebuffer Properties

          Here's a diff of the top-level IOMobileFramebuffer properties present/true on the built-in display vs. a Samsung G7:
          https://www.diffchecker.com/KWNHGET2/

          Can any display experts guess what these things do?
          I believe temperature compensation is rather standard stuff on LCDs but what about the rest? enableDarkEnhancer looks particularly interesting.

          When I get a chance I'll try to play with these, but it's going to be tough without knowing what to observe/measure; I also risk possibly ruining the display.

          APTDevice = true;
          APTEnableCA = true;
          APTEnablePRC = true;
          
          BLMAHOutputLogEnable = false;
          BLMAHStatsLogEnable = false;
          BLMPowergateEnable = true;
          BLMStandbyEnable = false;
          BypassPCC2DLed = false;
          
          DisablePCC2DBrc = false;
          DisableTempComp = false;
          
          IOMFBBrightnessCompensationEnable = true;
          IOMFBSupports2DBL = true;
          IOMFBTemperatureCompensationEnable = true;
          
          PCC2DEnable = true;
          PCCCabalEnable = true;
          PCCEnable = true;
          PCCTrinityEnable = false;
          PDCSaveLongFrames = true;
          PDCSaveRepeatUnstablePCC = true;
          
          VUCEnable = true;
          
          enable2DTemperatureCorrection = true;
          enableAmbientLightSensorStatistics = false;
          enableBLMSloper = true;
          enableDBMMode = true;
          enableDarkEnhancer = true;
          
          enableLAC = true;
          requestPixelBacklightModulation = false;
          uniformity2D = true;

            aiaf

            Here are Dropbox uploads of 240 fps microscope videos of a Mac mini M2 versus a Legion 7i, both connected via the same USB-C to DisplayPort cable to a 144 Hz LG monitor (the LG does not have a USB-C port) set to 100% brightness. The 7i shows 8-bit in advanced display settings on Windows 11 23H2 (build 22631.3235), and I'm using color control to (supposedly) disable dithering. The GPU is set in the BIOS to the Nvidia RTX 4080. I will do another round of testing when I have Windows 1809 running.

            The microscope was recording the central section of the Lagom gradient (banding) page. The videos titled "dark" are of the center part of the largest moon in this picture: https://beta.digitalblasphemy.com/2024/02/06/sinua/

            Can you see the differences? It looks to me like the flicker is different between the two. Also check the gradient tests for Stillcolor OFF. For those who are really sensitive to visible flicker, please don't watch these videos.

            https://www.dropbox.com/scl/fo/os2v37wcx5n33om1uajmp/h?rlkey=73kzo0fi93qxbytwbojjmcyq9&dl=0

            • aiaf replied to this.

              photon78s are these slowed down or in real time? What is the FPS of the original recording? QuickTime is showing 30 fps, and if your screen is at 144 Hz then we are simply missing a ton of frames, and it's difficult to say whether one thing or another is taking place.

                aiaf

                These are straight-out-of-camera, unedited files from the Samsung S10+ "slow motion mode". When you play them back in QuickTime, you are watching the motion slowed down. Looking through the scope with the naked eye, I definitely cannot see this pixel flicker.

                I see the problem with this technique: https://eu.community.samsung.com/t5/galaxy-s10-series/i-can-no-longer-record-slow-motion-on-my-s10-and-output-mp4/td-p/1528847
                The default camera app does not play back at real-time speed; everything is slowed down already.

                I'll try setting the screen to 60 Hz and using the Nikon Z7 for true 120 fps 1080p video.

                Thank you. Now the search continues for better external monitors!

                  aiaf i'm going to try to guess a few…

                  BypassPCC2DLed (and DisablePCC2DBrc) could this possibly be able to disable the "2D" array of mini-LEDs and force a uniform backlight?

                  IOMFBBrightnessCompensationEnable given that this is false on external but true on internal, this might be related to the uneven colors i was mentioning? i bet uniformity2D and NormalModeEnable are related too because the laptop seems like it's trying to "normalize" the brightness over areas of the display (and failing at that LOL)?

                  enableDarkEnhancer i bet this is dithering related for sure

                  enableLinearToPanel if this is what i think it is, it might be "converting linear (exact) colors into whatever the panel color space is". if so, it would be good to disable, because that might reduce color management / help prevent colors from being modified

                  interesting: i tried to edit your code to change these, and i get "IOKit not permitted" on all of them except enableDither, which goes through fine. any idea why that might be the case, and how to possibly modify these other properties?

                  • aiaf replied to this.

                    aiaf The flicker frequency matches the refresh rate. I am recording the flicker at 240 FPS.

                    I can't yet say for certain that FRC color dithering is the source of the flicker, but I can rule out PWM, since the flicker tracks with the refresh rate. I don't know the exact behaviour of LCD inversion, but at full white there is no visible flicker. I would expect full white to also show some flicker from inversion bias, but I'm not sure whether that artifact only occurs at lower voltages. There is also what appears to be a slight randomness to the flicker intensity per sub-pixel, which resembles FRC dithering.

                    The problem is that I've heard Apple might intentionally source slower-response-time panels to allow for better blending of temporal dithering signals. That latency makes it difficult to make sense of the flicker signal. Another factor is that I don't really see the noise signal change for the better under the microscope when enabling Stillcolor. If there were a layering of temporal noise happening, I would expect to see more flicker with it disabled. It still makes more sense to me that there is a static spatial dithering layer being removed by the command, and only on the built-in display. And we know from IOGraphicsTypes.h that there is a spatial dithering option.

                    I also tested an external OLED display, the Gigabyte AORUS FO48U. This display is OLED and apparently has 10-bit native color. It also apparently has no built-in FRC color dithering, but it still shows FRC-like flicker under the microscope. I think I trust that this is FRC dithering on that display, since the flicker frequency varies with refresh rate, OLEDs do not have LCD inversion, and the sub-pixels light out of order at times. There is no muddiness to the effect, since the response time of the OLED is near instant.

                    What is interesting is that when enabling Stillcolor, I see the difference in the Lagom Test Gradient on the built-in LCD, but not on the external OLED display. Again, my guess is the spatial dithering layer is removed by the command, but was only applied to built-in displays. But right now, it's all more of a guess.
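                    (To formalize the PWM-vs-FRC reasoning above as a crude rule of thumb, here's my own sketch; the function name, strings, and tolerance are made up, not from any measurement tool.)

```python
# My own formalization of the reasoning above: flicker locked to the refresh
# rate points at FRC dithering or LCD inversion, since PWM backlight flicker
# runs at its own fixed frequency regardless of refresh rate.

def classify_flicker(flicker_hz: float, refresh_hz: float, tol_hz: float = 1.0) -> str:
    """Crude classification by whether flicker tracks the refresh rate."""
    if abs(flicker_hz - refresh_hz) <= tol_hz:
        return "refresh-locked: FRC dithering or inversion (not PWM)"
    return "refresh-independent: e.g. PWM backlight"

# Hypothetical numbers: flicker at 120 Hz on a 120 Hz panel is refresh-locked,
# while a fixed 480 Hz flicker on the same panel would instead suggest PWM.
print(classify_flicker(120.0, 120.0))
print(classify_flicker(480.0, 120.0))
```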

                      Blooey

                      I am testing the M2 mini with a 144 Hz IPS LG display. When I look at the Lagom Test Gradient on that monitor and toggle Stillcolor, I only notice a slight brightness shift; the banding looks the same to my naked eye. I went into Terminal and verified that dithering is off.

                      @photon78s

                      I have the same setup as you, with no banding using USB-C to DisplayPort, only a minor shift in brightness. However, when using HDMI to HDMI, I experience banding with no relief. The issue lies in how macOS renders the UI, especially when I am in the scaled 'Retina' mode; it exacerbates the 'side effects' for me. I strongly believe that Retina mode creates a pattern glare that negatively affects my eyes and head. A quote from Reddit perfectly describes my situation: "For example, a 5K Mac screen outputs 5120×2880 and then divides it by 2 to display a 2560×1440 "retina" resolution.
                      This process (dividing) can be seizure-inducing for me, as it creates a pattern glare where everything appears to be moving and vibrating. Unlike Windows, which does not use this type of rendering and simply zooms in without dividing the resolution."

                        anon123

                        That's interesting. I'm new to modern Macs, as you can see (the last Mac I truly used for a significant period of time was the CRT iMac lol).

                        At this point I am a bit tired of chasing new monitors and hardware swaps. I'm focusing now on how to build up defenses (breaks, eye washes with water, acupressure, etc.) and using the screens less.

                          anon123 when macOS divides by exactly 2, it works by just doing a 200% zoom, the same as Windows, so the vibrating feeling they had is probably related to the usual Mac temporal dithering. Sounds just like the dithering symptoms I get on the iPhone SE 2, most Macs, AMD video cards, etc.

                          however, yes, there will be additional distortion and moiré patterns at not-exactly-2 scale factors (this is where it does differ from Windows). But exactly 2 should be fine, so they're definitely just feeling the symptoms of dithering.
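                          (to illustrate the exactly-2 point, here's a tiny sketch of my own, assuming macOS renders the backing store at 2x the "looks like" size and then scales it to the panel; the function name is made up)

```python
# My own illustration of HiDPI scaling (assumption: macOS renders at 2x the
# "looks like" resolution into a backing store, then scales that to the panel).

def needs_resampling(panel_px, looks_like_px):
    """True if the 2x backing store must be resampled to fit the panel."""
    backing = (looks_like_px[0] * 2, looks_like_px[1] * 2)
    return backing != panel_px

# Exactly-2 case: a 5K panel at "looks like 2560x1440" is a plain 200% zoom.
print(needs_resampling((5120, 2880), (2560, 1440)))  # False

# Not-exactly-2 case: a 14" MBP panel at "looks like 1712x1107" gets resampled.
print(needs_resampling((3024, 1964), (1712, 1107)))  # True
```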

                            photon78s I'm not sure what to make of these videos. If your Windows setup is outputting an 8-bit signal (with dithering disabled) over to an FRC display and you're measuring a flicker, then that says more about the display than your particular OS+GPU config.

                            We've already seen that Stillcolor disables Apple GPU/DCP dithering on the Mira e-ink monitor (USB-C to ?), the Blackmagic Design capture card (HDMI to HDMI, 1080p60), and via user testimony.

                            If you can take microscope footage of a monitor that's proven to be FRC-free at a particular bit depth, showing no flicker, and then take footage using an OS+GPU combo on the same display that demonstrates flicker, then we can start diagnosing the software.

                            Edit: does your phone have a "Pro Video" mode at 120 fps FHD? If you can set your output's refresh rate to the minimum possible (30fps?) and take scope footage at 120 fps, it will probably be easier to analyze.

                              dev