aiaf i notice the banding gets more significant / less precise on some transparency effects and shadows, especially when Software Brightness is lower. (Software Brightness is a BetterDisplay feature that controls brightness via the color profile/gamma table, for reference.)

there are also some very specific shades of gray where, when VUCEnable is at its default (on), my brain feels like they are not totally solid and have some blurry reddish blotches. when VUCEnable is off, this becomes obvious: the parts i suspected were more reddish show a more obvious reddish "blocky irregular pattern of larger squares" filling those areas. you can find these shades by slowly moving the Software Brightness slider while looking at a solid gray background; some levels will cause this pattern to appear immediately.

(i say "more obvious" but you still have to look closely to see them; it's just that an actual pattern of squares that always remains in the same position is visible now)

i did notice and do agree that leaving VUCEnable at default (on) actually did make me feel the best, despite the increased banding with it off. just disabling dither and uniformity2D seemed to create the most comfortable screen so far (relatively, still a lot worse than other laptops lol). i did feel like i noticed some strange flicker with VUCEnable off.

    NewDwarf true. It's not a fully fledged driver that respects every aspect of IOMFB.

    aiaf I'm starting to suspect that my MBP 16" Max display is true 10-bit.

    Could you please enter the command

    ioreg -l | grep IODisplayEDID

    to try to get the EDID value. Getting this information will probably tell us what display is in your laptop.


      aiaf Should I send it to you in a private message with the pastebin link?


        NewDwarf nothing pops out. Built-in displays do not use EDID AFAIK.

          aiaf manufacturer and bit depth exist in IORegistryExplorer. Not sure if they can be changed tho. They're mostly in nested structures. Mine was at 8 for the built-in display. If I remember correctly, I could change it below 8 as well with an EDID override for an external display

          aiaf nice example. I turned off dark mode on my MBA M3, and it's a bit better for my eyes. I don't know why, but gray/black colors give me more eye strain than light colors. If I open VS Code with a dark theme, it causes some eye strain and slight nausea.

          DisplaysShouldNotBeTVs I think the MBP is using a 10-bit display without FRC, according to specs from Apple. I know that the Apple Pro Display XDR is true 10-bit; that's also mentioned on their website. I know that their regular Apple Studio Display uses FRC, the same as the MBA.

            DisplaysShouldNotBeTVs

            no, it's a big difference

            macOS scales down
            Windows scales up

            try disabling "retina" mode in macOS and tell me if the text is gentler and easier to read despite being super blurry

            madmozg I think MBP is using 10bit display, without FRC

            On my MacBook Pro I do see a change with Stillcolor in that it seems to reveal quantization artifacts. Waves of magenta and cyan. Seems they have kept some level of dithering in place to further boost the display beyond 10 bits and manage those waves. It still isn’t clear if this is temporal or spatial dithering.

            Thanks for those screenshots. It really seems to follow that "support for X colors" is Apple's marketing term for FRC dithering.

              Blooey I also saw it on the MBP 14", don't know why. Looks like the software tried to send an 8-bit signal, but the Apple Silicon GPU is still using dithering or something like that.

              "Supports 1 billion colors" is wording used by manufacturers like LG, Samsung, ASUS, etc. You can check their websites and official specifications. In monitor manuals they will show you what type of panel it is: 8-bit, 8-bit+FRC, or true 10-bit. It's easy, but I think not that easy with Apple. I wonder if this kind of specification is required by US regulators or so? And why doesn't Apple have this kind of documentation describing what kind of panel is used in their devices...

              For example, ASUS's specifications for their monitors:

                madmozg This is just a commercial trick 😉

                Apple claims about 1 billion colors, achieved by temporal dithering, but they never say these are "fake" colors.

                Lots of references to pixelcapture in the flags. Wouldn't capturing the image directly from the framebuffer be enough to diff for dithering, if laptop screens are mostly dumb? I saw some old references to capturing pixel data on Google, but no idea how hard it would be to actually build a working solution here.

                Somewhat on the side, but if you could capture pixel data properly, it would be fairly easy to add some logic that scores how much the screen changes from frame to frame. You could even calculate it in a way that ignores cases where the same image is just shifted slightly in one direction for the next frame, so it gets a high score when you drag or scroll with choppy movement. There are massive differences in how choppy things are in browsers as well, based on the mouse used, how the drivers interact with the browser, whether smooth scrolling behaves well with it, and whether something is locking the ProMotion VRR, among other things.
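                The scoring idea above could be sketched roughly like this. This is a hypothetical, simplified illustration (all function names are made up, frames are tiny synthetic grayscale 2D lists, and real screen capture is left out entirely): score each frame pair by the fraction of pixels that changed, taking the minimum over small shifts so a slightly scrolled-but-otherwise-identical frame scores low while dither-like noise scores high.

```python
# Hypothetical sketch: score frame-to-frame change, ignoring small uniform
# shifts. Frames are grayscale 2D lists of 0-255 ints; no real capture here.

def diff_fraction(a, b, dx=0, dy=0, threshold=4):
    """Fraction of overlapping pixels differing by more than `threshold`
    when frame b is sampled at an offset of (dx, dy) relative to frame a."""
    h, w = len(a), len(a[0])
    changed = total = 0
    for y in range(h):
        for x in range(w):
            sy, sx = y + dy, x + dx
            if 0 <= sy < h and 0 <= sx < w:
                total += 1
                if abs(a[y][x] - b[sy][sx]) > threshold:
                    changed += 1
    return changed / total if total else 0.0

def change_score(prev, cur, max_shift=2):
    """Minimum diff over small shifts: a slightly shifted copy of the same
    image scores near 0, while per-pixel dithering noise stays high."""
    return min(
        diff_fraction(prev, cur, dx, dy)
        for dx in range(-max_shift, max_shift + 1)
        for dy in range(-max_shift, max_shift + 1)
    )

if __name__ == "__main__":
    flat = [[128] * 8 for _ in range(8)]
    # simulate dither-like noise: every other pixel is bumped by 8
    noisy = [[128 + (8 if (x + y) % 2 else 0) for x in range(8)]
             for y in range(8)]
    print(change_score(flat, flat))         # identical frames -> 0.0
    print(change_score(flat, noisy) > 0.3)  # noisy frame -> high score
```

                A real tool would obviously need actual framebuffer capture and something faster than nested Python loops, but the scoring logic itself is this simple.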


                  I have a Lumix S5II, and this camera can do 30 shots per second. Maybe I can somehow capture dithering on the MBA? Like setting a proper shutter speed, then using ffmpeg to make a video and look for differences/pixel flickering. I don't have any macro lenses unfortunately, to take really close-up photos of the display 🙁

                  Did anyone experiment with the logging flags? I couldn't figure out where they end up. Don't think they're written to disk automatically, at least.
