macsforme

I am in the same situation: I have the same BMD capture device, and I can only capture dithering from my RTX laptop when ColorControl dithering is set to "Auto: 8-bit Temporal", never when ColorControl dithering is set to disabled. My feeling is that I need something like a DVI2PCIe or an EDID emulator ahead of the BMD capture device in the signal chain; otherwise I'm comparing apples to oranges, as you said. I am also using a higher-bandwidth HDMI-to-HDMI cable that should, in theory, allow dithering to occur.

I posted some EDID devices here, though I'm not sure whether they are useful:
https://ledstrain.org/d/2589-products-to-try-or-avoid-pwm-flicker-and-temporal-dithering-testing/185

https://www.black-box.de/en-de/page/25843/Resources/Technical-Resources/Black-Box-Explains/multimedia/EDID-Emulation

Do you still see dithering from your video analysis when you disable dithering inside color control?

    async It will definitely NOT send the same signal to the capture card as to the monitor unless you spoof the EDID. You can use BetterDisplay to download the monitor EDID. I don't have a capture card myself, but it should be possible to spoof the EDID.

    This suggestion guided me in the right direction. I was able to spoof the EDID of one of my real monitors (with a macOS display EDID override file) to the BMD capture device, and it was still able to work and capture video. What I ultimately discovered was that regardless of EDID spoofing or not, my AMD Radeon Pro WX 5100 will only output a signal with temporal dithering if the capture device is connected at boot time. Disconnecting and reconnecting the capture device causes the output to no longer have temporal dithering until the next reboot… strange! Regardless, this resolves the main discrepancy I encountered regarding the WX 5100 GPU not seeming to have temporal dithering while the visual experience suggested otherwise. The fact that temporal dithering was only detected via DisplayPort and not via HDMI on all GPUs I tried (with the exception of my 2015 MacBook Pro 15" with an AMD dGPU, which exhibited it on both) was another interesting discrepancy which may yield helpful insights.
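
    In case it helps anyone reproduce this, here is a minimal sketch of one way to dump an EDID for such an override file on an Intel Mac, assuming ioreg exposes the IODisplayEDID property for the connected display (BetterDisplay's export function is the easier route):

    ```python
    # Minimal sketch: dump the EDID of each connected display on an Intel Mac.
    # Assumes ioreg exposes the IODisplayEDID property (true for Intel GPUs;
    # Apple silicon machines expose displays differently).
    import re
    import subprocess

    out = subprocess.run(
        ["ioreg", "-lw0", "-r", "-c", "IODisplayConnect"],
        capture_output=True, text=True, check=True,
    ).stdout

    # ioreg prints the EDID as a hex blob: "IODisplayEDID" = <00ffffffffffff00...>
    for i, blob in enumerate(re.findall(r'"IODisplayEDID" = <([0-9a-f]+)>', out)):
        path = f"display{i}.edid.bin"
        with open(path, "wb") as f:
            f.write(bytes.fromhex(blob))
        print(f"wrote {len(blob) // 2} bytes to {path}")
    ```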

    JTL I might have more to add later, but one reason I went for the DVI2PCIe is that you can program an arbitrary EDID binary to the card, and it remains persistent.

    If you mean the $1,400 DVI2PCIe Duo… ouch. 🙁 The Blackmagic device was $70 (used) on eBay, plus the cost of the Thunderbolt cable.

    Seagull I would be careful about making assessments about dithering by eye though.

    A valid point, and I completely agree. Correct me if I'm wrong, but my impression was that empirically measuring temporal dithering from a physical LCD panel requires sophisticated cameras/tools which are out of reach for most people. As for visual assessment, I try to limit my reliance on it to getting a general sense of image stability, and let that form the basis for further empirical investigation.

    photon78s Do you still see dithering from your video analysis when you disable dithering inside color control?

    I never got that far, as it looked like I needed to reboot my workstation to make ColorControl work fully, and I was unable to do so at the time. However, I expect I would have reached the same results that you did. My main concern for that workstation was whether dithering was off on the normal display (which was what ColorControl seemed to indicate), and I had a secondary concern that the BMD capture device might not be getting the same output that a standard monitor would (which now seems mostly resolved, at least on macOS).

    @DisplaysShouldNotBeTVs, are you willing to share your method for disabling temporal dithering on the AMD GPU on the MacBookPro11,5, as you've mentioned elsewhere? Now that I have a means of reliably measuring temporal dithering, I want to start trying to disable it, leveraging all possible methods.

      macsforme spoof the EDID

      Do you have the EDIDs of a "safe" and an "unsafe" monitor? Through CRU you can save them to a file, for example.

        simplex Do you have the EDIDs of a "safe" and an "unsafe" monitor? Through CRU you can save them to a file, for example.

        I have two displays (different models) connected to this Mac Pro, both of which I consider "safe," although they are 6-bit+FRC, so not ideal. Both generally seem to exhibit temporal dithering roughly equally, to a degree that varies with the GPU currently installed. I plan to post my findings soon, but so far all GPUs exhibited temporal dithering detectable with the ffmpeg time blend method from the post I linked above. Some outputs were (subjectively) more or less comfortable than others.
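
        For anyone who wants to replicate the detection without the exact ffmpeg invocation, here is a rough sketch of the same idea in Python; the frame file names are hypothetical, and it assumes a static test pattern was on screen while consecutive frames were exported as PNGs:

        ```python
        # Rough sketch of the time blend idea: with a static test pattern on
        # screen, any pixel that changes between consecutive captured frames
        # is a candidate for temporal dithering. Frame file names are
        # hypothetical (e.g. exported from the capture with ffmpeg beforehand).
        import numpy as np
        from PIL import Image

        a = np.asarray(Image.open("frame_001.png"), dtype=np.int16)
        b = np.asarray(Image.open("frame_002.png"), dtype=np.int16)

        diff = np.abs(a - b)
        per_pixel = diff if diff.ndim == 2 else diff.max(axis=-1)
        changed = np.count_nonzero(per_pixel)

        print(f"max per-channel delta: {diff.max()}")
        print(f"changed pixels: {changed}/{per_pixel.size} "
              f"({100.0 * changed / per_pixel.size:.2f}%)")
        # A dithering output typically shows a large share of pixels toggling
        # by +/- 1 LSB between frames; a clean output should diff to all zeros.
        ```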

          macsforme empirically measuring temporal dithering from a physical LCD panel requires sophisticated cameras/tools

          I've seen people do it with high-frame-rate phone cameras, but as I said before, I think the best way to do it would be a cheap optical sensor attached to a microphone jack, oscilloscope software to record the sensor's input, and a Fourier transform to show all the different flicker frequencies present.
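
          As an illustration of that last step, here is a minimal sketch, assuming the sensor signal has been recorded through the mic jack into a WAV file (the file name is hypothetical):

          ```python
          # Sketch: list the flicker frequencies present in a light-sensor
          # recording made through a sound card's mic input. "sensor.wav" is a
          # hypothetical file; any recording software that writes WAV will do.
          import numpy as np
          from scipy.io import wavfile

          rate, data = wavfile.read("sensor.wav")
          if data.ndim > 1:            # keep one channel if the recording is stereo
              data = data[:, 0]
          data = data - data.mean()    # drop the DC offset (steady brightness)

          spectrum = np.abs(np.fft.rfft(data))
          freqs = np.fft.rfftfreq(len(data), d=1.0 / rate)

          # Strongest components first; PWM/FRC flicker shows up as peaks.
          # (Bins adjacent to a strong peak also rank highly in this crude list.)
          for i in np.argsort(spectrum)[-10:][::-1]:
              print(f"{freqs[i]:8.1f} Hz  amplitude {spectrum[i]:.0f}")
          ```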

          2 months later

          macsforme I plan to post my findings soon

          The results of my investigation into temporal dithering on Intel-based Macs are as follows. This testing pertained to external outputs only (not built-in laptop displays), given that a capture card was used. The machines tested included a 2009 Mac Pro, a 2015 15-inch Retina MacBook Pro (iGPU-only and iGPU/AMD dGPU variants), a 2012 13-inch MacBook Pro (non-Retina), and a 2009 15-inch MacBook Pro. The GPUs tested in the Mac Pro included the original NVIDIA GT 120, an NVIDIA GT 640, an NVIDIA GTX TITAN (Kepler), an NVIDIA GTX 1080 FE, and an AMD Radeon Pro WX 5100.

          Notably, the macOS version and firmware version of each machine did not appear to make any difference in the outcome, in the cases where several were tested. Most of the HDMI output testing was done with a standard HDMI cable, but I repeated several of the tests with a high-speed HDMI cable, and it seemed to make no difference on my hardware.

          2009 Mac Pro - All NVIDIA cards do have temporal dithering on DisplayPort outputs, and do not have temporal dithering on HDMI outputs. The AMD WX 5100 has temporal dithering on its DisplayPort outputs (the only output type it has), but only when the monitor is plugged in at boot time (not after hot-plugging). Having multiple monitors plugged in and hot-plugging them produced inconsistent results with respect to dithering.

          2009 MacBook Pro & 2012 MacBook Pro - The former has an NVIDIA dGPU while the latter has an Intel iGPU; on mini-DisplayPort outputs, no temporal dithering was detected.

          2015 15-inch Retina MacBook Pro - The iGPU-only model (MacBookPro11,4) does not exhibit temporal dithering on either mini-DisplayPort or HDMI outputs. The model with the AMD dGPU (MacBookPro11,5) does exhibit temporal dithering on both mini-DisplayPort and HDMI outputs.

            macsforme The model with the AMD dGPU (MacBookPro11,5) does exhibit temporal dithering on both mini-DisplayPort and HDMI outputs.

            Does this change if you try the Psychtoolbox kext + Octave dither-disable method I posted? I've already confirmed this method affects flicker system-wide, as I can see small but noticeable changes to colors and reduced flicker on an old TN monitor I tried it on. (Unfortunately, that monitor applies its own FRC to 8-bit signals, and macOS does not seem capable of forcing a 6-bit connection.)

            So I'm curious whether the PTB kext method actually stops the "GPU-level" temporal dithering entirely or only "reduces" it, before I go searching for a true 8-bit monitor.

              DisplaysShouldNotBeTVs Does this change if you try the Psychtoolbox kext + Octave dither-disable method I posted?

              I have not yet had a chance to experiment with that method, but I am curious about that as well. I hope to look into it at some point so we'll have another empirical data point.

              5 days later

              jordan Btw, in reply to a comment above: if you want that DVI2PCIe cheap, there's no need for the Duo. Just get this one; it's missing the bracket, but I'll also link one that works. Another user here and I have ordered from both of these links.

              Thanks for the suggestion. I picked up one of these capture cards and set it up over the weekend.

              Is anyone with one of these cards willing to share their method for detecting dithering with it? Is the "videodiff" program used for this? I tried the ffmpeg time blend method, but so far I could not detect dithering after trying several configurations (including the 2015 15-inch Retina MacBook Pro with the AMD dGPU, which did exhibit dithering on the Blackmagic capture card). The ffmpeg time blend command appears to be set up for 10-bit video, whereas the Epiphan DVI2PCIe seems to capture 8-bit video, so that could be the difference, or it may simply be that DVI output does not dither on these GPUs.
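
              For what it's worth, one way to check what pixel format (and thus bit depth) a recording actually contains before comparing captures is a quick ffprobe call; a sketch with a hypothetical file name:

              ```python
              # Sketch: report the pixel format (and thus effective bit depth)
              # of a recording, so an 8-bit Epiphan capture isn't compared
              # against a 10-bit Blackmagic capture on the wrong assumptions.
              # The file name is hypothetical.
              import subprocess

              def pix_fmt(path: str) -> str:
                  return subprocess.run(
                      ["ffprobe", "-v", "error", "-select_streams", "v:0",
                       "-show_entries", "stream=pix_fmt",
                       "-of", "default=noprint_wrappers=1:nokey=1", path],
                      capture_output=True, text=True, check=True,
                  ).stdout.strip()

              print(pix_fmt("capture.mov"))  # e.g. "yuv422p10le" vs "uyvy422"
              ```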

                macsforme Remember that if you're only connecting to a capture card, it won't provide the same EDID that an actual monitor would, which can sometimes cause totally different output behavior.

                I would suggest dumping a realistic EDID from an actual monitor (using BetterDisplay's save-EDID function, etc.), then using one of the "force EDID" methods to tell macOS to substitute the monitor's EDID for the capture card's. Then see if the output changes.

                Especially the bit depth and YCbCr support fields of an EDID, together with the fact that a lot of monitors actually provide color calibration information through their EDID (this is what causes that profile with the name of the monitor to show up in display settings), strongly affect how much a GPU will decide to dither.
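
                For reference, the advertised bit depth lives in byte 20 of an EDID 1.4 base block, so you can check what a dumped EDID claims with a small sketch like this (the input file name is hypothetical, e.g. a binary saved by BetterDisplay or CRU):

                ```python
                # Sketch: print the color bit depth a display (or capture card)
                # advertises in its EDID. Only EDID 1.4 encodes bit depth in
                # byte 20, so the version bytes are checked first. The input
                # file name is hypothetical.
                DEPTHS = {0: "undefined", 1: "6-bit", 2: "8-bit", 3: "10-bit",
                          4: "12-bit", 5: "14-bit", 6: "16-bit", 7: "reserved"}

                with open("display0.edid.bin", "rb") as f:
                    edid = f.read()

                assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", "not an EDID block"
                version, revision = edid[18], edid[19]
                print(f"EDID version {version}.{revision}")

                vid_input = edid[20]                 # Video Input Parameters byte
                if vid_input & 0x80:                 # bit 7 set: digital input
                    if (version, revision) >= (1, 4):
                        print("advertised depth:", DEPTHS[(vid_input >> 4) & 0x07])
                    else:
                        print("EDID older than 1.4: bit depth not encoded here")
                else:
                    print("analog input: no bit depth field")
                ```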


                  I have discovered that UEFI has its own video driver configuration, called the UEFI GOP. If I understand the idea correctly, UEFI GOP works from the monitor's EDID data: imagine the monitor is 8-bit + FRC; you disable 10-bit in Windows/Linux to get a pure 8-bit signal, yet UEFI continues to scale it to 10-bit at its own level.

                  Could this UEFI GOP behavior of motherboards be what some here call the "non-disableable dithering" of video cards?

                  DisplaysShouldNotBeTVs I would suggest dumping a realistic EDID from an actual monitor (using BetterDisplay's save-EDID function, etc.), then using one of the "force EDID" methods to tell macOS to substitute the monitor's EDID for the capture card's. Then see if the output changes.

                  Actually, I wouldn't trust forcing the EDID on the target device. A forced EDID can behave differently from native EDID detection.

                  The DVI2PCIe is one of the few capture cards that can be programmed to output an arbitrary EDID.

                  Good points. I understand the goal of replicating the exact configuration of a real display for accurate testing/comparison. My concern with programming an arbitrary EDID would be that the EDID advertises the capabilities of the display (or capture card), so I could end up with a video format that the capture card is not actually compatible with.

                  Epiphan’s web site has EDID files you can download to force certain resolutions if you don’t otherwise have a way to do so. From what I gather, this seems to be the main purpose of the EDID upload feature.


                    macsforme Good points. I understand the goal of replicating the exact configuration of a real display for accurate testing/comparison. My concern with programming an arbitrary EDID would be that the EDID advertises the capabilities of the display (or capture card), so I could end up with a video format that the capture card is not actually compatible with.

                    I think that concern is partially unfounded. As long as the color format output by the target device is supported by the capture card, it should work. And if not, just reprogram the "stock" EDID and you're no worse off than when you started.

                    macsforme I tried the ffmpeg time blend method, but so far I could not detect dithering after trying several configurations (including the 2015 15-inch Retina MacBook Pro with the AMD dGPU, which did exhibit dithering on the Blackmagic capture card).

                    If you send me some files and information in a PM, I might be able to help.

                    13 days later

                    macsforme All NVIDIA cards do have temporal dithering on DisplayPort outputs, and do not have temporal dithering on HDMI outputs.

                    macsforme The iGPU-only model (MacBookPro11,4) does not exhibit temporal dithering on either mini-DisplayPort or HDMI outputs.

                    After further reflection, the before-and-after of my vision strain is starting to make sense (if temporal dithering is the cause), given my finding that the GPUs I tested tended to dither on DisplayPort outputs only and not on HDMI (and, I assume, not on DVI).

                    Years ago I ran dual 8-bit (purportedly) displays on a 2014 Mac Mini, which (if consistent with my iGPU MacBook Pro) would not have had dithering on the Intel iGPU outputs. I later upgraded to a Mac Pro (with various GPUs over time) and a 3-monitor setup, but my standard refresh rate monitors (same purportedly 8-bit ones) were on either HDMI or DVI, while the DP output went to a high-refresh gaming monitor (likely reducing the effects of dithering on that output).

                    Later, after ditching the horrible 14" M1 Pro MacBook Pro, I ran my Mac Pro with an AMD Radeon Pro WX 5100 (4x DP outputs) with DisplayPort going to all three monitors (including the standard refresh rate ones). I also tried some of my NVIDIA cards using the DisplayPort outputs. After that, I never had 100% of the comfort level I had previously.

                    For the last few weeks, I've been using one of my original ASUS VS239H(-P, I think) monitors with a 2012 non-retina MacBook Pro with Intel HD 4000 graphics, running a fairly clean install of the latest Ventura. So far this has been giving me no issues (so far as I can tell… there is still some unavoidable strain from other screens at work and LED lighting). I am tempted to try going back to a 2014 Mac Mini with an Intel iGPU, or just keep running one of my MacBooks with Intel iGPUs in clamshell mode.

                    Interestingly, I just booted into my main data drive running Monterey (which was a migration from my 14" M1 Pro MacBook Pro originally) for the first time in about a month and a half, using roughly the same hardware, and I could instantly tell there was some kind of strain there. So, either I screwed up some macOS graphics settings on the 14" M1 Pro MacBook Pro while trying to stop the strain (and those settings migrated over), or else there was something inherent in the Monterey setup on that Apple silicon machine that came over with the migrated settings. This is consistent with my observation in another thread:

                    macsforme When I happened to boot another disk with a clean Monterey installation on that same computer, it looked "calmer" than my main Monterey environment, making me wonder if some strain-inducing setting(s) were migrated back with my data (although plausibly, my condition could just be deteriorating).

                      macsforme the GPUs I tested tended to dither on DisplayPort outputs only

                      You can do the same test I did:

                      1. Use only the iGPU's motherboard output (HDMI or DP, doesn't matter)
                      2. Plug the graphics card into a PCIe slot (whichever one: CPU or chipset lanes, Gen 3 or Gen 4, doesn't matter), but keep the monitor connected to the iGPU
                      3. Manually assign some apps to the dGPU

                      When you use the iGPU in this scheme, there is no eye strain. When you start a dGPU-assigned application, the eye strain begins; when you minimize the application to the tray, the eye strain is gone 🙂

                      This means the dGPU renders its video data using dithering and then copies the dithered result to the iGPU. My test equipment: Z690-P, 12600K, RTX 3080 Ti.

                      macsforme Just curious, did that also apply to the GTX 1080, with HDMI being dither-free while DP dithered?

                        jordan Just curious, did that also apply to the GTX 1080, with HDMI being dither-free while DP dithered?

                        Yes, I just confirmed. This was on macOS High Sierra with the NVIDIA web drivers, but I did a few tests on Windows with a different setup, and my impression was that the behavior is the same there. The HDMI tests were performed with a standard-bandwidth HDMI cable.

                        Sample time blend frame of DisplayPort output:

                        Sample time blend frame of HDMI output:

                          macsforme Wow, it's crazy that the port matters. I know someone, for example, who tested DP off the Quadro RTX 4000 (Turing) GPU with the Epiphan, and he confirmed no dithering (I have the video). I wonder if it's your cable that is triggering it, maybe due to lack of bandwidth?

                          Do you plan to try more GPUs? Intel says the Arc A770 doesn't dither, but who knows if that's even true.
