aiaf

I guess SDI is the only possible way right now to get true lossless capture (not the Elgato cards).

Never mind, I see. Currently I am using a highly lossy, cheap HDMI capture device, not to measure anything, but as a way to reduce my symptoms when using a dithering Windows 11/NVIDIA setup: it simply fails to faithfully reproduce the dithering patterns when viewed through OBS running on a safer, less-dithering secondary device.

aiaf suspects there's a 2nd layer of dithering that happens before the buffer gets displayed on the built-in panel.

This is not clear to me… Dithering, by definition, only has an effect (on the human eye) when light is emitted by the pixel/sub-pixels.

You already proved objectively (with the Blackmagic recorder) that dithering is totally gone when dithering is disabled.

If some kind of dithering had been present, we would have noticed it in the diff video. It doesn't matter how often a sub-pixel flickers; the difference will show up on a static picture sooner or later.
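The diff-video argument can be sketched as a simple check, assuming the captured frames of a static test image are available as plain pixel arrays (the function and variable names here are illustrative, not from any real capture tool):

```python
def has_temporal_dither(frames):
    """Diff-video check: given successive captured frames of a
    static test image (each frame a list of pixel values), report
    whether any pixel ever differs from the first frame.
    With dithering truly disabled, every diff must be zero."""
    first = frames[0]
    return any(frame != first for frame in frames[1:])

# A sub-pixel flickering in even a single frame is caught:
static = [[128, 64], [128, 64], [128, 64]]
dithered = [[128, 64], [128, 65], [128, 64]]
print(has_temporal_dither(static))    # → False
print(has_temporal_dither(dithered))  # → True
```

However rarely the flicker occurs, recording long enough means some frame eventually differs from the reference, which is why a clean diff is strong evidence.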

…BTW, I also ordered the Blackmagic recorder toy. The problem is I don't have any Apple silicon laptop.

    NewDwarf the Asahi Linux project has made giant strides in reverse engineering the DCP interface, and have written clients/drivers for it https://github.com/AsahiLinux/linux/tree/asahi/drivers/gpu/drm/apple

    marcan has hours-long streams where he does this https://www.youtube.com/@marcan42/search?query=DCP it's awe-inspiring really

    I already have a theory about forcing a bit depth but I will share it in due course.

      NewDwarf a video capture card can only ever record the processing that macOS applies to external display output. it can't record whatever changes are occurring on the internal display.

      for example, uniformity2D is a parameter that controls a "fade to black" effect at the edges of specifically the internal XDR display. when toggling this, you can clearly and obviously see the brightness of the screen edges changing on the MacBook screen itself.

      however, this parameter will not change anything for external display output, including any HDMI/DisplayPort capture card recording. (even though capture cards are a good way to measure the quality of external monitor output, and still provide far more information about how external video signals are being altered than e.g. a basic QuickTime screen recording, which tells you nothing)

      for example, trying to tell external output to turn uniformity2D on won't make a black fade ever show up on an external monitor; the property only ever controls the internal miniLED panel, which can't be captured through video out.

      aiaf really good to know. does this mean it's recommended to leave VUCEnable at the system default, because trying to disable it would actually introduce more flicker?


        NewDwarf so the way I understand DisplaysShouldNotBeTVs is that there's either another IC exclusive to the built-in panels which does dithering on the data received from the DCP. Or there's another piece of code in the GPU/DCP pipeline that does dithering for the built-in panel regardless of what enableDither is set to. Or some kind of color/brightness/voltage/temperature compensation/correction exclusive to the built-in panel has some dithering as a side effect. Time will tell.

        DisplaysShouldNotBeTVs for my single experiment, leaving it on is better. But I have to repeat this measurement multiple times to really verify. What do you notice in the pattern/tone of banding when VUC is disabled?

        (Edit: I meant leaving it on. Leave VUCEnable = true as is.)

          aiaf interesting, I'd actually be very surprised if it's a true 10-bit panel

          do you suspect the M2 MacBook Air with "support for billions of colors" is also a true 10-bit panel?

          given that, in comparison, the M1 MacBook Air specs say the internal screen only supports "millions of colors", despite macOS still forcing the desktop bit depth to 10 bpc on both Airs

          on the other hand, the other possible scenario is that the M2 Air uses FRC on the panel to achieve "10-bit", like how some 10-bit-capable external monitors work, while the M1 Air simply chops off the last 2 bits of precision
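          For reference, FRC ("frame rate control") approximates extra bit depth by temporal dithering: the panel alternates between two adjacent 8-bit levels so their time-average lands on the 10-bit target. A minimal sketch of the idea (illustrative only, not how any specific panel controller is actually implemented):

```python
def frc_frames(v10, n_frames=4):
    """Emit an n-frame 8-bit sequence whose time-average equals the
    10-bit target v10 / 4 (FRC-style temporal dithering)."""
    base, frac = divmod(v10, 4)  # 8-bit base level + 2-bit remainder
    # show base+1 on `frac` out of every 4 frames, base on the rest
    return [min(base + (1 if i < frac else 0), 255) for i in range(n_frames)]

# 10-bit level 513 averages to 128.25, between 8-bit levels 128 and 129:
frames = frc_frames(513)
print(frames, sum(frames) / len(frames))  # → [129, 128, 128, 128] 128.25
```

          An M1 Air-style truncation would instead output `base` on every frame, statically losing the last 2 bits, which produces banding but no flicker.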

            aiaf i notice the banding gets more significant/less precise on some transparency effects and shadows, especially when Software Brightness is lower. (a BetterDisplay feature that controls the brightness of the color profile/gamma table, for reference.)

            there are also some very specific shades of gray where, when VUCEnable is default (on), my brain feels like they are not totally solid and have some blurry reddish blotches. when VUCEnable is off, this becomes obvious: the parts i suspected were more reddish now show an obvious reddish "blocky irregular pattern of larger squares" filling those areas. you can find these shades by slowly moving the Software Brightness slider while looking at a solid gray background; some levels will cause this pattern to appear immediately.

            (i say "more obvious" but you still have to look closely to see them, it's just that an actual pattern of squares that always remain at the same position is visible now)

            i did notice and do agree that leaving VUCEnable at default (on) actually did make me feel the best, despite the increased banding with it off. just disabling dither and uniformity2D seemed to create the most comfortable screen so far (relatively, still a lot worse than other laptops lol). i did feel like i noticed some strange flicker with VUCEnable off.

              NewDwarf true. It's not a fully fledged driver that respects every aspect of IOMFB.

              aiaf I'm starting to suspect that my MBP 16" Max display is true 10-bit.

              Could you please enter the command

              ioreg -l | grep IODisplayEDID

              to try to get the EDID value? Getting this information will probably tell us what display your laptop has.
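              If the command does return an EDID blob, the advertised bits per primary color live in byte 20, the Video Input Definition byte of EDID 1.4. A sketch of decoding it (the function name is mine; per the EDID 1.4 spec, bit 7 flags a digital input and bits 6:4 encode the color depth):

```python
def edid_bits_per_color(edid: bytes):
    """Decode bits-per-primary-color from an EDID 1.4 blob.
    Byte 20 is the Video Input Definition: bit 7 set means digital
    input; bits 6:4 then encode 6/8/10/12/14/16 bpc (0 = undefined)."""
    vid = edid[20]
    if not vid & 0x80:
        return None          # analog input: no bit-depth field
    code = (vid >> 4) & 0x07
    return None if code == 0 else 4 + 2 * code

# e.g. byte 20 = 0xB0 (digital input, depth code 3) advertises 10 bpc:
fake_edid = bytes(20) + bytes([0xB0])
print(edid_bits_per_color(fake_edid))  # → 10
```

              Note this only reflects what the display *claims*; as discussed above, it says nothing about FRC behind the panel interface.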


                aiaf Should I send it to you in a private message with a pastebin link?


                  NewDwarf nothing pops out. Built-in displays do not use EDID AFAIK.

                    aiaf the manufacturer and bit depth exist in IORegistryExplorer. Not sure if they can be changed, though. They're mostly in nested structures. Mine was at 8 for the built-in display. If I remember correctly, I could change it below 8 as well with an EDID override for an external display.

                    aiaf nice example. I turned off dark mode on my MBA M3, and it's a bit better on my eyes. I don't know why, but gray/black colors give me more eye strain than light colors. If I open VS Code with a dark theme, it causes some eye strain and slight nausea.

                    dev