DannyD2 Interesting. Couldn't listen to it with xpcspy. However, I found a tool in its folder for manually testing / tweaking the monitor colors. Not sure if it saves to color tables.

/System/Library/PrivateFrameworks/AmbientDisplay.framework/Versions/A/Resources/Calibration\ Assistant.ap

Hunter20 Pretty sure this is just the "slight brightness dips on every frame" form of PWM that Apple uses. It happens across the entire display at once and, if recorded at 240 Hz, can sometimes be seen "scanning" from top to bottom.

It's not temporal dithering, because dithering can almost never be captured on a phone camera. And I've discovered that this PWM is even "technically" there on white; it's just hard for the camera to pick up, but you can see it by turning the exposure way down.

It happens on a surprising number of Macs: the 2015 15" rMBP has the flicker, the 2015 12" MacBook has it, and almost all of the Apple Silicon Macs have it (the only exception being the M2 Touch Bar Pro, which does not flicker on camera at all no matter what shade of gray is onscreen).

A lot of reviewers fail to pick it up because it can't be easily caught on camera if showing a white background or especially when showing a photo. But it is still there, and becomes very visible when medium grays are onscreen.

However, I don't seem to be sensitive to this form of PWM, probably due to its low flicker depth (despite being really sensitive to a lot of other PWM types): I currently use an M1 Air, which has the flicker issue but is a very usable screen, since it can fully disable dithering via Stillcolor. It's actually much more usable IMO than the "worse, very straining" kinds of M2 Touch Bar Pro units with "FMX" in the panel ID (despite even these "bad" M2 TB panels not flickering on camera). Although there are good M2 TB panels as well.

DannyD2 probably hard to use this hack as-is nowadays because of read-only system volume.

However, the same result is still totally possible today: you can probably just set up a simple script that re-triggers the BetterDisplay command to "set contrast enhancer strength to 0" every few seconds, as in the sketch below.
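A minimal sketch of that loop, reusing the -specifier/-framebufferNumericProperty betterdisplaycli syntax and the IOMFBContrastEnhancerStrength property name that both appear later in this thread (the 5-second interval is an arbitrary choice of mine):

#!/bin/sh
# Keep-alive loop: re-apply contrast enhancer strength = 0 every 5 seconds
# so the system can't quietly restore it.
while true; do
  betterdisplaycli set -namelike=built -specifier=IOMFBContrastEnhancerStrength -framebufferNumericProperty=0
  sleep 5
done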

    new-jdm yeah this makes sense, probably only relevant for internal screens and Apple Studio Display / Pro Display XDR which also have ambient light sensors

    @async just discovered the biggest improvement to my MacBook internal display screen comfort in years, and it's something totally unrelated to flicker. Fixing temporal dithering flicker already improved my m1air so much, but this just makes it even better.

    since m1air is already usable "for the most part" for me (it can FULLY disable dithering with Stillcolor, and i'm not sensitive to its form of PWM), i've now started to look into more general ways to improve screen comfort aside from flicker. i've discovered something really interesting…

    will save a post with more fleshed out and easily readable instructions for later as i'm really busy right now, but i want to put this out here:

    HUGE REDUCTION in strain by forcing my m1air's Retina display to act as a *TRUE* non-Retina display.

    Simply setting a "non-retina" resolution like 1280x800 @1x (AKA half of physical resolution) does NOT achieve this. That just activates Apple's scaling filter. It looks extremely blurry, since it forces a smoothing filter even in cases where simple "pixel doubling" would be possible. And it means you also get whatever mystery "oversharpening" artifacts Apple has decided to include.

    -

    Here's how to get "true" non-Retina:

    (FYI: Of course, also disable dithering with Stillcolor. This trick is totally different from, and doesn't replace, disabling temporal dithering, which is equally important.)

    • To get the exact same result as me, disabling macOS font smoothing is recommended.

    • Create a BetterDisplay virtual display with 16:10 aspect ratio but don't mirror it!

    • Instead, activate Screen Streaming on the virtual display and set the target to your laptop's internal display. ⚠️ FYI BetterDisplay Pro is required!

    • Set your physical display to true native resolution. The "Really Tiny UI" one. (for example, 2560x1600 LoDPI for m1air, or 3024x1964 LoDPI for 14" Pro)

    • Set the virtual display to non-Retina resolution. (filter Virtual Screen Mode by LoDPI, then pick the LoDPI version of your true native resolution divided by 2.)

    • Make sure "Displays have separate Spaces" in Desktop/Dock/Mission Control settings is checked.

    • Start Screen Streaming, get to the point where you're basically seeing the second virtual display projected onto your real display but with streaming instead of mirroring.

    If BetterDisplay gets into a glitchy state where you can't see your primary desktop windows anymore (which happens the very first time you set it up, but not after), just blindly Cmd+Space Spotlight search for BetterDisplay, press Enter, and then press Cmd+Q to quit it and get your windows back. Then restart BetterDisplay.
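    (If you can reach a terminal, e.g. over SSH from another machine, a sketch of an alternative that skips the blind typing; killall and open -a are stock macOS commands, and I'm assuming the process is simply named BetterDisplay:)

    killall BetterDisplay   # quit the app to release the glitched stream
    open -a BetterDisplay   # relaunch it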

    • Set the virtual display as your primary display. Use the Arrangement feature in macOS Display System Settings to align your physical display to one of the corners so you don't accidentally move your cursor off the virtual display.

    • Enable Resume Stream on Connect in the virtual display's Screen Streaming menu so the stream reactivates after resuming from sleep.

      (FYI this means your login screen will no longer have the password field since the stream doesn't show up there and you're instead just looking at your physical "secondary" display. The field is still focused on the virtual display, just "blindly" type+press Return or use Touch ID and you can still log in just fine.)

    ⬥ Now here's the magic step:

    • Enable "Integer Scaling" in the Screen Streaming menu.

    Your internal display should now show a truly sharp, pixel-doubled "non-Retina" desktop.

    🐚🐚🐚🐚🐚🐚🐚🐚 You are finally home!!! 🐚🐚🐚🐚🐚🐚🐚🐚

    -

    (BTW: If you now have a "blurry" mouse pointer, just set a custom pointer color in accessibility display settings, it will make the pointer sharp again.)

    Turns out that I am sensitive to high-density displays. Even independent of flicker, I've realized it's another one of the core reasons why I get that feeling of a "false sense of depth".

    On a Retina/HiDPI display, there's nothing you can really "lock onto" and easily focus on at a density where you can't even see pixels — especially in the case of text. The only time you're able to see "noticeable edges" at a HiDPI-level density is at the corners of sharp rectangles, and that can end up making those parts of the screen feel "very distracting" while you're trying to read something else.

    So what if, instead of that, we make every single pixel an equal, uniform, and visible 2x2 sharp rectangle? That's what "true" non-Retina does — and what the usual "filtered" non-Retina modes don't.

    Even though the physical pixel grid remains invisible after forcing "true non-Retina", suddenly just the simple fact that now I can clearly see the squared-off "pixels" at the edges of text and rounded windows improved my ability to focus and the feeling of "flatness" of the display by an incredibly large amount. IMMEDIATE and extremely noticeable improvement in comfort, reading ability, everything.

    It seems like Retina/HiDPI displays are simply too sharp, too overstimulating for me while actually working on them especially with information-dense content (and not just watching videos or playing games).

    "Dumbing them down" into truly looking like a non-Retina display (instead of an "approximation of one" with blurry filtering) makes a world of difference for me.

    I'm pretty sure that "seeing pixels" lets my brain know I'm looking at something obviously fake — vs. a usual HiDPI display which looks much closer to reality and probably takes "a LOT more processing" for me to intuitively understand that I need to focus on it very differently than a physical object. Hence the false sense of 3D etc.

    I discovered this after using a Windows 11 VM (UTM) on my m1air which defaulted to a non-Retina resolution but the VM used sharp integer scaling to display this, instead of filtering. Using this VM in fullscreen felt more comfortable than anything else on my m1air, to the point where I started browsing the web in it because I felt so good using it. Turns out it feels so good because it's "true non-Retina" with visible pixel edges.

    Now, with the BetterDisplay Screen Streaming + Integer Scaling method, I can say that macOS as a whole finally feels just as comfortable to me as that Windows VM.

    By the way, there's even a chance that running at "true non-Retina" (AKA perfect pixel doubling instead of filtering) may also be able to reduce flicker from pixel inversion:

    Does FHD on a 4K monitor with integer scaling look like an FHD monitor?

    Yes. Moreover, games and videos at FHD with integer scaling on a 4K monitor look even better than on a monitor with native FHD resolution […] crystal-inversion flickering is almost unnoticeable.

    https://tanalin.com/en/articles/integer-scaling/#h-faq-like-native

    BTW, you can also use pretty much the same screen streaming method if you want to be able to use scaled retina resolutions without introducing Apple's oversharpening and fringing artifacts. (physical display at native resolution, virtual display at e.g. 1680x1050 retina etc.)

    Virtual Display Streaming (BetterDisplay Pro only) is what makes all of this possible. This is not possible with the basic "virtual display mirroring"!

    Final note: I've recently noticed that there is a slight difference in sharpness between setting the physical display in this process to the native LoDPI "Really Tiny UI" resolution (e.g. 2560x1600 @1x) vs. the native Retina 2x resolution (e.g. 1280x800 @2x).

    In both cases, the virtual display is set to 1280x800 @1x non-Retina. This means you still achieve "properly sized larger UI" in the end after the stream is set up.

    Not sure what is causing this difference as 1280x800 @2x is technically just "2560x1600 but the UI is zoomed in by 200%", but there definitely is one.

    IMO, 2560x1600 @1x on physical (NOT 1280 @2x) feels noticeably more comfortable.

      DisplaysShouldNotBeTVs Couldn't make it fit. Need to try some more.

      Super interested in ways to turn off the oversharpening and other effects. But where in the system are these applied? It doesn't seem to be the framebuffer. QuartzCore? IOSurface? Surely things like sharpening level and fringing around text must be configured somewhere, as it would make no sense for the Apple developers to have to recompile when tweaking. What other things than ioreg and plists are there? Are there any really hidden plists not available thru PlistEdit Pro? Even just finding exactly where the text anti-aliasing plist is read would be interesting. I have yet to find anything while looking thru symbols and using ripgrep. I can't even find where the defaults for the ioreg framebuffer flags are coming from, or even the strings for things like enableDithering. Really interested in input here. @aiaf @waydabber

      Also, you really should try my striped overlay to see if it makes it easier to focus the screen. Do you have node/npm installed? I find it easier to read text when focusing on the white space below text.

        async Couldn't make it fit.

        what do you mean by "make it fit"? do you mean being able to choose the exact 1512x945 resolution for the virtual display?

        it's a special case for the 14"/16" models as they use nontraditional resolutions; you need to select "match aspect ratio of and associate with a display" while creating the virtual display in order to get the exact resolution to show up in the list

          async But where in the system are these applied, as it doesn't seem to be the framebuffer

          i would think it's the DCP/TCON itself (or possibly built-in to the GPU), similar to how external monitors will apply their own scaling algorithm if you're sending them anything other than native resolution

          (because for example, sending "more space" retina resolution is simply sending e.g. a 2880x1800 image to my 2560x1600 m1air display and something needs to scale that down back to 2560x1600)

          using a variation of my screen streaming workaround that leaves HiDPI on instead, you can leave the internal display at e.g. the real 1512x945 @2x HiDPI (14" Pro) and then set your virtual display to e.g. 1920x1200 @2x (also HiDPI, to get more screen space)…

          that way, instead of the GPU/TCON/whatever, you're essentially rendering a fullscreen window on your physical display, where the Quartz compositor/WindowServer at the OS level will do the scaling without any artifacts while still sending a "native resolution" image to the physical display in the end.

          or, as i've done, use 1512x945 @2x (14" Pro) on physical, and 1512x945 @1x on the streamed virtual display with Integer Scaling checked, to get a true pixel-doubled non-Retina mode.

          FWIW i actually find true pixel-doubled non-Retina at half of the physical resolution (larger UI) more comfortable than a "more space" Retina option. before, i was having trouble using the 1280x800 HiDPI version as it felt too large, but i don't have that issue with the pixel-doubled version; my eyes "understand it" so much better.

          for me that was 1280x800 @2x on physical and 1280x800 @1x on virtual display.

          (and this is true even considering that m1air's lower native PPI creates an even larger UI at 200% scaling than the 14" Pro would at its native resolution. Integer Scaling+@1x is finally able to make this larger UI feel comfortable for me)

          all of this only works with BetterDisplay screen streaming, NOT mirroring: mirroring happens at a lower level, after everything in the OS is already rendered and handled by the GPU, while streaming simply creates a fullscreen macOS window with the duplicated screen image inside of it.

          async Surely things like sharpening level and fringing around text must be configured somewhere, as it would make no sense for the Apple developers to have to recompile when tweaking

          given that the scaling artifacts have existed and have remained the same since T2 Intel Macs (2018 and later), my guess is either the TCON or possibly the DCP (and in the Intel era, the T2's ARM processor+bridgeOS playing the role of the DCP here)

          i don't think it's the GPU, since the artifacts are almost exactly the same on T2 Intel Macs like the 2020 Intel 13" Pro, and it would probably be difficult to recreate that exactly on a whole new kind of GPU with the M1. i could be wrong though.

          i've noticed the artifacts on:

          • 2018 Intel MBA
          • 2020 13" Intel MBP
          • Anything M1 or later

          (so not just M1)

          i've noticed no artifacts on:

          • 2016 13" MBP
          • 2015 15" rMBP
          • 2015 12" MacBook

          this is why i suspect the artifacts were introduced with the T2 ARM processor in 2018

          DisplaysShouldNotBeTVs

          BTW, I have also just tested this on my 2018 Intel Air (which has temporal dithering that cannot be disabled; even the dither=0 nvram trick does not work. The 2018 Air actually uniquely has 8-bit and sRGB set as the default color mode, and doesn't even have True Tone, but still dithers anyway).
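          (For reference, the form of that trick I'm assuming is meant here is the boot-args variant, set from Terminal and followed by a reboot:)

          sudo nvram boot-args="dither=0"
          # undo later with: sudo nvram -d boot-args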

          Even on that laptop, "true non-Retina" can help improve screen comfort!

          It's not as significant because temporal dithering is still an issue on that Mac, but it now feels much more usable to me. Before, using my 2018 Air on macOS felt like it was burning my eyes. Now, it still feels "unstable" for sure, but I'm at least able to look at it for a good amount longer now.

          And it may possibly reduce how often the 2018 Air's integrated GPU needs to dither as well, because every "pixel" is now made up of exactly 4 physical pixels… 👀

          DisplaysShouldNotBeTVs By the way, absolutely huge shoutout to @waydabber here for implementing Integer Scaling for screen streaming in BetterDisplay. I finally bought BetterDisplay Pro for this feature alone after using the limited free version for years 🙂

          Anything visually equivalent to a "pixel-doubled non-Retina mode" is actually something that I've wanted to do on Retina Macs out of curiosity for years (and can also help with using older software and websites without them appearing blurry), but didn't know was possible until just a few days ago.

          Even on Windows machines it's very difficult to do this, for example Intel only just recently added an option for integer scaling (and only for laptops with Ice Lake integrated graphics and later).

          For a while I searched "how to prevent blurry scaling for non-Retina apps" or "is it possible to use the exact non-Retina resolution on a Retina Mac with sharp pixel-doubled scaling" but would never get any results. Ended up having to specifically search BetterDisplay and then finally stumbled across this feature through a GitHub issue.

          It's a bit hidden away, but I personally see it as a flagship feature of the app 😃 it should probably be highlighted more, as I believe a lot of people are definitely looking for this, even outside of strain reduction…

          One suggestion: I wish there was a way to show/simulate the "hardware brightness" OSD on the streamed display. It becomes entirely invisible after starting a stream with the internal display as the target. (I want to keep using hardware brightness instead of software dimming on the virtual display.)

          DisplaysShouldNotBeTVs Just had to put the resolution into the aspect ratio. This was actually quite nice together with the current flags I use. Using it with a scaled resolution, and getting rid of the excessive sharpening, was really nice. I have no idea how people manage to fit anything on the screen with the default 2x resolution. Even on a 15".

          It still has some sort of slow blotching tho. I don't think it is the backlight. It moves for roughly 1.5 seconds after all changes. Mostly on light gray. I feel like I've tried pretty much all settings without getting closer to understanding what it actually is. It is applied after color table quantization and everything, but is removed by GPU dithering.

          Some interesting observations of this setup:

          1. Metal image adjustments on the built-in screen are ignored. Might be because of the way BetterDisplay handles streaming. Color profiles mostly work.
          2. Color profiles on the streaming display apply gamma or something, but completely ignore the color adjustments in the ICC profile. Meaning that, for example, my color profile that makes whites yellow and blacks bluish doesn't do anything. Which is a bit strange.
          3. Lag that affects the cursor is a rather annoying issue, but it might be possible to increase the CPU priority of the streaming somehow, at least with SIP off (see the sketch below).
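          A minimal sketch of the priority idea, assuming the streaming work happens inside the BetterDisplay process itself (renice and pgrep are stock macOS tools; the process name is a guess):

          # negative nice value = higher scheduling priority
          sudo renice -n -10 -p "$(pgrep -x BetterDisplay)"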

          Current flags:

          "enableDither": false,
          "uniformity2D": false, // true
          "VUCEnable": false, // true
          "overdriveCompCutoff": NSNumber(0), // 334233600
          "IOMFBContrastEnhancerStrength": NSNumber(0), // affects things with high values, but keeps getting wiped out
          "IOMFBBrightnessCompensationEnable": false, // true
          "IOMFBTemperatureCompensationEnable": false, // true
          "enable2DTemperatureCorrection": false,
          "APTEnableCA" : false, // no visible effect
          "APTEnablePRC" : false,
          "APTPDCEnable" : false,
          "enableLAC": false, // true, no idea
          "enableDarkEnhancer": false, // true
          "BLMAHMode": NSNumber(1), // 2 default. 1 seems better
          "BLMPowergateEnable": false, // can't see any difference
          "enableDBMMode": false, // true on m1 max, not there on touchbar
          "enableBLMSloper": false, // true
          "APTEnableDefaultGray" : false, // no idea what it does
          "DisableDisplayOptimize": NSNumber(1), // 0, not sure if stable
          "IdleCachingMethod": NSNumber(1), // 2, disables a flag that switches back and forth on activity. prevents colored cursor from switching color profile upon software/hardware cursor.

            async It still has some sort of slow blotching tho. I don't think it is the backlight. It moves for roughly 1.5 seconds after all changes. Mostly on light gray.

            yep, pretty sure it's some kind of compensation for local dimming zones. saw it all the time on my mini-LED after messing with color settings.

            since your contrast enhancer strength is 0 — even though i can get a "similar" animated blotching effect to show up on my m1air by raising that — it's definitely not the same thing as what happens on mini-LED.

            i honestly do not think it's fixable unless a true "disable local dimming" method is found (which i'm also not sure is possible).

            LCD Macs like m1air aren't affected by it though.

            For these slightly colored blotches that move over 1.5s after any Metal / color table brightness adjustment or window movement: they move instantly when adjusting the native XDR brightness. Tbh that doesn't tell me much, but it might provide at least some clue as to where they are added.

            They are not the local dimming zones or anything, as they are removed with GPU dithering.

            This is different from both the dark compensation blotch from ContrastEnhancement and the glowing LED behind colored things.

            Digging around again. Just some scrambled notes.

            AppleM2ScalerCSCDriver seems relevant for the scaling artifacts. It has references to a ton of adjustments done by IOSurface in ioreg. It also has a ton of dither options, as seen in the bridgesupport file. Not entirely sure if these types of values can be changed somehow. Found some references to Python projects using that bridge file.

            /System/Library/Frameworks/IOKit.framework/Versions/A/Resources/BridgeSupport/IOKit.bridgesupport
            2979:<enum name='kConnectionControllerDitherControl' value64='6775907'/>
            6253:<enum name='kIODisplayDitherAll' value64='255'/>
            6254:<enum name='kIODisplayDitherDefault' value64='128'/>
            6255:<enum name='kIODisplayDitherDisable' value64='0'/>
            6256:<enum name='kIODisplayDitherFrameRateControl' value64='4'/>
            6257:<enum name='kIODisplayDitherRGBShift' value64='0'/>
            6258:<enum name='kIODisplayDitherSpatial' value64='1'/>
            6259:<enum name='kIODisplayDitherTemporal' value64='2'/>
            6260:<enum name='kIODisplayDitherYCbCr422Shift' value64='16'/>
            6261:<enum name='kIODisplayDitherYCbCr444Shift' value64='8'/>
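            For poking at these from the shell, stock ioreg can at least dump what the driver currently exposes (my own suggestion, separate from the bridgesupport enums above):

            ioreg -l -w0 -c AppleM2ScalerCSCDriver   # dump properties of the scaler driver instances
            ioreg -l -w0 | grep -i dither            # hunt for dither-related keys across the registry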

            ioreg values for it can be changed with ioio like this. However, it does not appear to change much.
            ioio -s AppleM2ScalerCSCDriver DisableSIMDOptimizations false

            IOAccelerator keeps popping up as well.

            Not sure what this is for, but it provides an interesting effect.
            betterdisplaycli set -namelike=built -specifier=CMDegammaMethod -framebufferNumericProperty=1

            Seems to use the M2Scaler

            • userlandkernel/iokitstuff (https://github.com/userlandkernel/iokitstuff): "Please contribute by reversing the kexts and implementing easy to use methods around the userclients"
            • Siguza/iokit-utils (https://github.com/Siguza/iokit-utils): dev tools for probing IOKit
            • RehabMan/OS-X-ioio (https://github.com/RehabMan/OS-X-ioio): command-line utility for setting ioreg properties via IOService::SetProperties
            • macmade/IOBrowser releases (https://github.com/macmade/IOBrowser/releases)

              New discovery!

              My overlay with a diagonal pattern 100% removes the blotching while using screen streaming. So the algorithm that adds it targets solid color areas. I can lower the brightness to a level where the pattern isn't even visible to me, and it still works. This is what the pattern looks like with higher opacity.

              Edit: A bigger pattern of crosses does not have the same effect, nor do scanlines. 0.5% opacity black for the lines is enough to fix it. Not entirely sure why the blotches exist on the streamed display, but not on the regular monitor.

              Edit2: Not necessarily related here, but APTFixedRR does not seem to apply at all when streaming. Quartz Debug shows an FPS higher than what the monitor is set to, and it never limits it. Maybe streaming bypasses all the APT stuff.

              Edit3: When streaming, the blotching affects both the window that shows the streamed screen and the actual screen. Meaning that the entire screen goes into some different mode that causes it.
