macsforme Remember that if you're only connecting to a capture card, it won't provide the same EDID that an actual monitor would, which can sometimes cause completely different output behavior.

I would suggest dumping a realistic EDID from an actual monitor (using BetterDisplay's save-EDID function, etc.), then using one of the "force EDID" methods to tell macOS to substitute the monitor's EDID for the capture card's. Then see if the output changes.

The bit depth and YCbCr support fields of an EDID matter especially. On top of that, a lot of monitors provide color calibration information through their EDID (this is what causes that profile with the name of the monitor to show up in Display Settings), all of which strongly affects how much a GPU will decide to dither.
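As an aside, those fields sit at fixed offsets in the EDID base block, so you can check what a dumped EDID actually advertises before forcing it anywhere. A minimal sketch, assuming an EDID 1.4 digital display (the `parse_edid_color_caps` helper is my own name, not part of BetterDisplay or any tool mentioned here):

```python
def parse_edid_color_caps(edid: bytes) -> dict:
    """Extract bit depth and YCbCr support from a 128-byte EDID 1.4 base block."""
    header = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])
    if len(edid) < 128 or edid[0:8] != header:
        raise ValueError("not a valid EDID base block")

    video_input = edid[20]              # byte 20: Video Input Definition
    if not (video_input & 0x80):        # bit 7 clear -> analog input
        return {"digital": False}

    depth_code = (video_input >> 4) & 0x07   # bits 6-4: color bit depth
    depths = {0: None, 1: 6, 2: 8, 3: 10, 4: 12, 5: 14, 6: 16}

    features = edid[24]                 # byte 24: Feature Support
    enc_code = (features >> 3) & 0x03        # bits 4-3: supported color encodings
    encodings = {
        0: ["RGB 4:4:4"],
        1: ["RGB 4:4:4", "YCbCr 4:4:4"],
        2: ["RGB 4:4:4", "YCbCr 4:2:2"],
        3: ["RGB 4:4:4", "YCbCr 4:4:4", "YCbCr 4:2:2"],
    }
    return {"digital": True,
            "bits_per_color": depths.get(depth_code),
            "encodings": encodings[enc_code]}
```

Bytes 20 and 24 are the Video Input Definition and Feature Support fields from the VESA E-EDID 1.4 standard; EDID 1.3 digital displays don't declare bit depth this way, so a `None` depth just means "not declared."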

  • JTL replied to this.

    I have discovered that UEFI has its own video driver configuration, called UEFI GOP. If I understand the idea correctly, UEFI GOP works based on the EDID data of the monitor: imagine the monitor is 8-bit + FRC; you disable 10-bit in Windows/Linux to get a pure 8-bit signal, but UEFI continues to scale it to 10-bit at its own level.

    Could this UEFI GOP behavior of motherboards be what some people here call the "non-disabled dithering" of video cards?

    DisplaysShouldNotBeTVs I would suggest dumping a realistic EDID from an actual monitor (using BetterDisplay's save-EDID function, etc.), then using one of the "force EDID" methods to tell macOS to substitute the monitor's EDID for the capture card's. Then see if the output changes.

    Actually, I wouldn't trust forcing the EDID on the target device. A forced EDID can behave differently from native EDID detection.

    The DVI2PCIe is one of the few capture cards that can be programmed to output an arbitrary EDID.

    Good points. I understand the goal of replicating the exact configuration of a real display for accurate testing/comparison. My concern with programming an arbitrary EDID would be that the EDID advertises the capabilities of the display (or capture card), so I could end up with a video format that the capture card is not actually compatible with.

    Epiphan’s web site has EDID files you can download to force certain resolutions if you don’t otherwise have a way to do so. From what I gather, this seems to be the main purpose of the EDID upload feature.

    • JTL replied to this.

      macsforme Good points. I understand the goal of replicating the exact configuration of a real display for accurate testing/comparison. My concern with programming an arbitrary EDID would be that the EDID advertises the capabilities of the display (or capture card), so I could end up with a video format that the capture card is not actually compatible with.

      I think that concern is partially unfounded. As long as the color format being output by the target device is supported by the capture card, it should work. And if not, just reprogram the "stock" EDID and you're no worse off than when you started.

      I tried the ffmpeg time blend method, but so far I could not detect dithering after trying several configurations (including the 2015 15-inch Retina MacBook Pro with AMD dGPU, which did exhibit dithering on the Blackmagic capture card).

      If you send me some files and information in a PM, I might be able to help.

      13 days later

      macsforme All NVIDIA cards do have temporal dithering on DisplayPort outputs, and do not have temporal dithering on HDMI outputs.

      macsforme The iGPU-only model (MacBookPro11,4) does not exhibit temporal dithering on either mini-DisplayPort nor HDMI outputs.

      After further reflection, the before-and-after of my vision strain is starting to make sense (if temporal dithering is the cause), given my finding that GPUs I tested tended to dither on DisplayPort outputs only and not on HDMI (and I assume not on DVI).

      Years ago I ran dual 8-bit (purportedly) displays on a 2014 Mac Mini, which (if consistent with my iGPU MacBook Pro) would not have had dithering on the Intel iGPU outputs. I later upgraded to a Mac Pro (with various GPUs over time) and a 3-monitor setup, but my standard refresh rate monitors (same purportedly 8-bit ones) were on either HDMI or DVI, while the DP output went to a high-refresh gaming monitor (likely reducing the effects of dithering on that output).

      Later, after ditching the horrible 14" M1 Pro MacBook Pro, I ran my Mac Pro with an AMD Radeon Pro WX 5100 (4x DP outputs) with DisplayPort going to all three monitors (including the standard refresh rate ones). I also tried some of my NVIDIA cards using the DisplayPort outputs. After that, I never had 100% of the comfort level I had previously.

      For the last few weeks, I've been using one of my original ASUS VS239H(-P, I think) monitors with a 2012 non-retina MacBook Pro with Intel HD 4000 graphics, running a fairly clean install of the latest Ventura. So far this has been giving me no issues (so far as I can tell… there is still some unavoidable strain from other screens at work and LED lighting). I am tempted to try going back to a 2014 Mac Mini with an Intel iGPU, or just keep running one of my MacBooks with Intel iGPUs in clamshell mode.

      Interestingly, I just booted into my main data drive running Monterey (which was a migration from my 14" M1 Pro MacBook Pro originally) for the first time in about a month and a half, using roughly the same hardware, and I could instantly tell there was some kind of strain there. So, either I screwed up some macOS graphics settings on the 14" M1 Pro MacBook Pro while trying to stop the strain (and those settings migrated over), or else there was something inherent in the Monterey setup on that Apple silicon machine that came over with the migrated settings. This is consistent with my observation in another thread:

      macsforme When I happened to boot another disk with a clean Monterey installation on that same computer, it looked "calmer" than my main Monterey environment, making me wonder if some strain-inducing setting(s) were migrated back with my data (although plausibly, my condition could just be deteriorating).

        macsforme I tested tended to dither on DisplayPort outputs only

        You can run the same test I did:

        1. Use only the iGPU's motherboard output (HDMI or DP, doesn't matter)
        2. Plug a graphics card into PCI-E (whichever slot, CPU or chipset lanes, Gen 3 or Gen 4, doesn't matter) but keep the monitor connected to the iGPU
        3. Manually set some apps to use the dGPU

        When you use the iGPU in this scheme, there is no eye strain. When you start a dGPU-associated application, the eye strain begins; when you minimize the application to the tray, the eye strain goes away 🙂

        It means the dGPU renders its video data using dithering, and finally copies the dithered result to the iGPU for output. My test equipment: Z690-P, 12600K, RTX 3080 Ti

        macsforme Just curious, did that also apply to the GTX 1080 (HDMI being dither-free while DP dithered)?

          jordan Just curious, did that also apply to the GTX 1080 (HDMI being dither-free while DP dithered)?

          Yes, I just confirmed. This was on macOS High Sierra with the web drivers, but I did a few tests on Windows with a different setup and my impression was that the behavior is the same on Windows. The HDMI tests were performed with a standard bandwidth HDMI cable.

          Sample time blend frame of DisplayPort output:

          Sample time blend frame of HDMI output:

            macsforme Wow, that's crazy that the port matters. For example, I know someone who tested DP off the Quadro RTX 4000 (Turing) GPU with the Epiphan, and he confirmed no dithering (I have the video). I wonder if it's your cable that is triggering it due to lack of bandwidth, maybe?

            Do you plan to try more GPUs? Intel says the Arc A770 doesn't dither, but who knows if that's even true.

              jordan I wonder if it's your cable that is triggering it due to lack of bandwidth, maybe?

              This is quite possible, although I'm not sure that the math backs up this theory. The Blackmagic capture device itself is HDMI 2.0, and my DisplayPort to HDMI active cable is supposedly DisplayPort 1.4 (but the supported resolutions on the product page suggest that HDMI 2.0 is the bottleneck). For what it's worth, I did repeat most of the tests with an "Ultra High Speed HDMI" (48Gbps) cable, with identical results. The GTX 1080 also has HDMI 2.0, so with both ends supporting HDMI 2.0 as well as the cable, that should not be a bottleneck. I'd like to determine what the actual negotiated bandwidth is in macOS, but in any case this seems like plenty of margin for 1080p 60Hz (even with 10-bit color). The fact that we have to (actively) convert DisplayPort to HDMI or DVI (which appear to be very different signals) introduces a complexity that I wish we could avoid.
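For what it's worth, a back-of-the-envelope calculation supports the "plenty of margin" point. A rough sanity check (the overhead factors are loose assumptions standing in for blanking intervals and TMDS-style line encoding, not exact figures for any specific link):

```python
def min_link_gbps(width, height, hz, bits_per_component,
                  blanking_overhead=1.25, encoding_overhead=1.25):
    """Rough required link rate in Gbps: active pixel data for 3 color
    components, padded by assumed blanking and line-encoding overhead."""
    active_bits = width * height * hz * bits_per_component * 3
    return active_bits * blanking_overhead * encoding_overhead / 1e9

# 1080p60 at 10 bpc needs roughly 5.8 Gbps even with generous overhead,
# far below the 18 Gbps HDMI 2.0 link rate.
rate = min_link_gbps(1920, 1080, 60, 10)
```

So even a marginal HDMI 2.0 cable should carry 1080p60 10-bit several times over; bandwidth alone doesn't look like the trigger.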

              jordan I know someone for example did test dp off the quadro rtx 4000 (turing)gpu with the epiphan and he confirmed no dithering(I have the vid).

              I am still trying to figure out how to identify dithering in an Epiphan capture. However, DVI and HDMI seem to be similar signals, and I only detected dithering over HDMI (with the Blackmagic device) on one machine to begin with, the MacBook Pro 15-inch Retina with AMD dGPU.

              jordan Do you plan to try more GPUs?

              I am open to it. My primary goal was to find a Mac-compatible GPU with a DisplayPort output that doesn't dither, but so far I have not found one. I do have some interest in finding a safe GPU on the Windows side to use at work, however.

                macsforme I am still trying to figure out how to identify dithering in an Epiphan capture.

                What aspect do you need help with exactly?

                  JTL I have some captures, but I have not determined a method to detect whether dithering is present in them. I have an Epiphan capture of the one HDMI source I found to exhibit dithering in a Blackmagic capture (MacBook Pro 2015 15-inch Retina with AMD dGPU), but at least with the ffmpeg time blend method from the StillColor thread, no dithering was evident in it. However, the ffmpeg time blend method was set up for 10-bit RGB, so it may be an invalid comparison until the arguments are tweaked. I also briefly looked at VideoDiff but have not gone hands-on with it yet.
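For anyone else trying this, the time blend idea reduces to differencing consecutive frames: a dither-free static source differences to pure black, while temporal dithering leaves LSB-level noise. A toy illustration with simulated single-channel frames (the dithering pattern here is synthetic, not from a real capture):

```python
def temporal_difference(frame_a, frame_b):
    """Per-pixel absolute difference between two frames, the same idea as
    ffmpeg's tblend=all_mode=difference applied to consecutive frames."""
    return [[abs(a - b) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

# A flat gray frame, and the same frame with +1 LSB flicker on a checkerboard.
static = [[128] * 8 for _ in range(8)]
dithered = [[v + (x + y) % 2 for x, v in enumerate(row)]
            for y, row in enumerate(static)]

no_dither = max(max(row) for row in temporal_difference(static, static))  # 0
dither = max(max(row) for row in temporal_difference(static, dithered))   # 1
```

The catch with real captures is exactly the bit-depth mismatch mentioned above: if the pipeline decodes the capture at a different depth than the signal, a one-LSB flicker can get scaled or rounded away, which would explain a false negative.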

                  EDIT: I stumbled upon an interesting discovery in the Epiphan capture application. If I set Options->Display->Display format->RGB 8 bits per pixel (8 bits total, versus the standard 24 bits total), then I can watch the dithering visually in the preview window in real-time. I tried capturing a few different sources tonight, and the visual preview was consistent with my previous findings. That is, the above-mentioned Retina MBP with AMD dGPU dithers (also, new discovery, a recent Ubuntu live installer USB dithers on my GTX 1080 via DisplayPort); conversely, the same year Retina MBP with the Intel iGPU only, and the same GTX 1080 in Windows via HDMI, do not dither.
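A plausible explanation for why the 8-bits-total preview makes dithering visible: quantizing 24-bit RGB down to 3-3-2 keeps only the top bits of each channel, so a one-LSB temporal flicker that straddles a bucket boundary becomes a full-bucket color jump, while the same flicker inside one bucket stays invisible. A sketch assuming a straightforward 3-3-2 packing (a guess on my part; I haven't seen Epiphan document the exact format):

```python
def rgb888_to_rgb332(r, g, b):
    """Pack 8-bit-per-channel RGB into one byte: 3 bits red, 3 green, 2 blue."""
    return (r & 0xE0) | ((g & 0xE0) >> 3) | ((b & 0xC0) >> 6)

# A +/-1 flicker straddling a 3-bit bucket boundary changes the packed byte,
# so it shows as visible flicker in the 8-bit preview...
assert rgb888_to_rgb332(127, 0, 0) != rgb888_to_rgb332(128, 0, 0)
# ...while the same flicker inside one bucket is invisible.
assert rgb888_to_rgb332(128, 0, 0) == rgb888_to_rgb332(129, 0, 0)
```

This would also explain why only some gray levels and gradient regions appear to flicker in the preview: only values sitting on a bucket edge get amplified.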

                  • JTL replied to this.

                    macsforme If you can find some way to send me the files and PM me I can investigate.

                    For those who experimented with capture cards: can we explain the phenomenon of "eye strain only inside a specific app, even when it runs in a window"? To me it seems the drivers' temporal dithering is either enabled for the full screen or completely off, with nothing in between. Were you able to reveal temporal dithering only inside a specific window (e.g. Firefox, Chromium, VSCode, games, or anything else that is hardware-accelerated)? To detect this, it may be necessary to turn off desktop composition, if possible.

                      KM I never found a specific application to be dithering; it was either the whole system dithering or none. Some applications which caused me strain were very blurry, which I think was the cause, though in other applications there is no blurriness or anything visible to cause strain. I had wondered if some applications led to changes in my monitors' behaviour. My monitors do use temporal dithering, and it is likely that different colours are dithered differently.

                        Seagull Prodeus is the only game that strains my eyes a lot and it has a specific dithering setting that can't be completely disabled.

                        2 months later

                        Seagull, can you please tell me how your adaptation is going?

                        How much did you use bad devices at the start? How much and how do you rest from them? How do you avoid the cumulative effect of symptoms? And finally, how do you evaluate that one device is acceptable for adaptation while another isn't?

                        I tested a bunch of laptops and PCs throughout this year, but whenever I passed 5 or 6 days of testing I got such bad pain and pressure that I needed to spend a week or two resting from any device.

                        Seagull also, about food intolerances and allergies: did you feel any eye strain or migraine without using any devices?

                        I mean, did you experience symptoms after some food or after exposure to air with allergens, but not after using bad devices?

                        JTL Is this the only capture card that can use a different EDID?

                        It's a little bit expensive compared to Blackmagic devices 😂
