http://www.lagom.nl/lcd-test/black.php

(also contains other monitor tests)

I don't appear to notice anything on my MacBook Pro display, although that could be caused by:

a) Me not knowing what to look for
b) My vision being too poor to notice
c) This display does not use temporal dithering

I'm thinking it's c). I used to use an ASUS VE278Q monitor above 90% brightness to avoid PWM (my vision got worse due to the high brightness), and I know from the manual that it uses a 6-bit TN panel with FRC (temporal dithering). I could see the "snow" effect almost always on the Windows desktop with certain backgrounds, though it never seemed to bother me.

I'd be interested in hearing what others think, and most importantly whether anyone has captured the snow artifact on camera or the like.


24 days later

JTL Another option may appear as the default setting: "auto", which makes it hard to know whether that means on or off. Some time ago I talked to the devs in Nouveau's IRC channel. They checked how it's done and said Nouveau's temporal dithering option is always disabled, except for laptop-type displays. IIRC, for such displays color depth checks are done to see whether temporal dithering is needed to create the illusion of 8-bit color.
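One way to check what a given output actually ended up with is to read the property back. Below is a minimal C sketch using XRandR; note the property name "dithering mode" is an assumption based on what Nouveau has exposed in the past and may differ between driver versions (build with `cc file.c -lX11 -lXrandr`):

    /* Query a Nouveau-style "dithering mode" output property via XRandR.
     * Hypothetical sketch: the property name and the meaning of its
     * values depend on the driver version; values are driver enum
     * indices, not simple on/off flags. */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "no X display\n"); return 1; }

        Window root = DefaultRootWindow(dpy);
        XRRScreenResources *res = XRRGetScreenResources(dpy, root);
        Atom prop = XInternAtom(dpy, "dithering mode", True);
        if (prop == None) {
            fprintf(stderr, "driver does not expose \"dithering mode\"\n");
            return 1;
        }
        for (int i = 0; i < res->noutput; i++) {
            Atom type; int format;
            unsigned long nitems, after;
            unsigned char *data = NULL;
            if (XRRGetOutputProperty(dpy, res->outputs[i], prop, 0, 1,
                                     False, False, AnyPropertyType,
                                     &type, &format, &nitems, &after,
                                     &data) == Success && nitems) {
                XRROutputInfo *info = XRRGetOutputInfo(dpy, res,
                                                       res->outputs[i]);
                printf("%s: dithering mode = %ld\n", info->name,
                       *(long *)data);
                XRRFreeOutputInfo(info);
                XFree(data);
            }
        }
        XRRFreeScreenResources(res);
        XCloseDisplay(dpy);
        return 0;
    }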

4 days later

I found a post on the Internet that suggests a poor dithering algorithm can cause eye strain:

Is this dithering noticeable? Will a true 24-bit display outperform an 18-bit one in (2D) picture quality, or is the difference negligible?

I am afraid that depends on the dithering algorithm used by the manufacturer. For example, AUO 18-bit TFT panels have an unpleasant "flashing" effect due to a poor dithering algorithm. Samsung's 172X is an 18-bit panel yet uses an excellent dithering algorithm, making it better than some full 24-bit panels. After all, some 24-bit panels reproduce color so poorly that an 18-bit panel with good dithering will produce far better color. It all depends on the individual TFT panel and how well it reproduces color or uses dithering.

Source
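To make the FRC idea concrete, here is a toy model of how a 6-bit panel can fake an 8-bit level (my own illustration, not any vendor's actual algorithm):

    /* Toy model of frame rate control (FRC), for illustration only: a
     * 6-bit panel approximates an 8-bit level by alternating between the
     * two nearest 6-bit levels over a 4-frame cycle. How that pattern is
     * spread over frames and neighboring pixels is exactly what separates
     * a good dithering algorithm from one with visible "flashing". */
    #include <stdio.h>

    /* 6-bit level shown on a given frame for an 8-bit target value. */
    static unsigned six_bit_frc(unsigned target8, unsigned frame)
    {
        unsigned base = target8 >> 2; /* nearest lower 6-bit level */
        unsigned frac = target8 & 3;  /* leftover quarters: 0..3   */
        /* Show the higher level on `frac` out of every 4 frames. This
         * naive pattern concentrates flicker at refresh/4 (15 Hz at
         * 60 Hz); better algorithms also vary the pattern spatially so
         * neighboring pixels don't flash in sync. */
        return base + ((frame & 3) < frac ? 1 : 0);
    }

    int main(void)
    {
        /* 8-bit level 130 averages to 6-bit level 32.5 over 4 frames. */
        for (unsigned f = 0; f < 8; f++)
            printf("frame %u: 6-bit level %u\n", f, six_bit_frc(130, f));
        return 0;
    }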


    vinkenvvt That source is quite old (~2004). I wonder whether this still holds true today, and also whether the dithering is done in software (by graphics drivers) or in hardware. I can believe that low-quality dithering methods cause problems, but on recent MacBooks such as mine I don't think I've had any.

    More research is needed.

    For example, I believe many desktop TN panels implement dithering in hardware, such as my old ASUS monitor that I ran at high brightness to mitigate PWM. It gave me no other problems in the two years I used it before its failure, although I could sometimes see a slight snow effect on certain backgrounds. That effect doesn't seem to manifest on my MacBook (maybe because of the higher resolution, a better implementation in the Samsung panel, or a different GPU setup?).

    When I was talking to the head of displays at Apple, he mentioned that laptop displays have more pins than, say, HDMI or VGA connectors because they are just "dumb" panels that can't do much on their own; they rely on an external driver/controller board to handle brightness/PWM and the connection to the GPU. All they do is display frames.

    I found something: in the Linux kernel GPU source code, at line 188:

    Source

    188         /* Set the dithering flag on LVDS as needed, note that there is no
    189          * special lvds dither control bit on pch-split platforms, dithering is
    190          * only controlled through the PIPECONF reg. */
    191         if (INTEL_INFO(dev)->gen == 4) {
    192                 /* Bspec wording suggests that LVDS port dithering only exists
    193                  * for 18bpp panels. */
    194                 if (crtc->config->dither && crtc->config->pipe_bpp == 18)
    195                         temp |= LVDS_ENABLE_DITHER;
    196                 else
    197                         temp &= ~LVDS_ENABLE_DITHER;
    198         }

    Yes: on gen-4 hardware they enable LVDS dithering when the display is 6-bit per channel (18 bpp), and that could be what causes eyestrain.
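    If someone wanted to test that theory, a hypothetical experiment (my own untested sketch, not an upstream patch) would be to make the gen-4 branch quoted above clear the bit unconditionally and rebuild i915:

        /* Untested sketch: ignore crtc->config->dither in the gen-4
         * branch above and always clear the LVDS dither bit, then
         * rebuild i915 to A/B-test dithering as the eyestrain source. */
        if (INTEL_INFO(dev)->gen == 4)
                temp &= ~LVDS_ENABLE_DITHER;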

    vinkenvvt Did you remove the link to the Linux kernel source for the Intel graphics driver (the dithering code)? I was going to look into it. 🙁

    2 years later

    I just found this thread about oscilloscope + photodiode measurements, which has info about detecting temporal dithering:
    http://www.eevblog.com/forum/testgear/beginner-needs-to-record-dignals-from-a-display-backlit/?all

    In posts #22 and #25 he talks about dithering and says that by analyzing the frequency spectrum one can see whether a monitor panel's temporal dithering is active; he says it appears as a spike at 15 Hz. I don't really understand this yet, as I have yet to learn the very basics of frequency spectra and spectrum analysis, but if it's true, it might be a very easy and perhaps reliable test for temporal dithering that we could add to our oscilloscope setup.

    Maybe this is clearer to some of you when you read it than it is to me? This is stuff that wasn't taught at school. I also don't understand how a refresh rate of 60 Hz would lead to just 30 Hz polarity inversion (shouldn't it be 60 Hz, too?). I have a feeling that understanding all this is vital to understanding the possible eye strain effects of temporal dithering.

    It could also be that only FRC (the display hardware's own internal dithering) appears as 15 Hz flicker, while external temporal dithering (from the graphics hardware or driver) is sent inside the normal 60 Hz video signal, thus becoming 30 Hz flicker(?). That would also mean the two could interfere with each other...
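    If anyone wants to try the spike test on captured data, here is a small self-contained C sketch of mine. It assumes you can export evenly-sampled brightness values from the scope, and it uses the Goertzel algorithm to measure power at a few frequencies of interest without a full FFT; the synthetic trace below just stands in for a real photodiode export (build with `cc file.c -lm`):

        /* Check a brightness trace for narrow spectral spikes (15 Hz FRC,
         * 30 Hz inversion, 60 Hz refresh) with the Goertzel algorithm,
         * which measures power at one frequency without a full FFT. */
        #include <math.h>
        #include <stdio.h>

        #ifndef M_PI
        #define M_PI 3.14159265358979323846
        #endif

        static double goertzel_power(const double *x, int n,
                                     double freq, double sample_rate)
        {
            double coeff = 2.0 * cos(2.0 * M_PI * freq / sample_rate);
            double s1 = 0.0, s2 = 0.0; /* two-sample filter state */
            for (int i = 0; i < n; i++) {
                double s = x[i] + coeff * s1 - s2;
                s2 = s1;
                s1 = s;
            }
            return s1 * s1 + s2 * s2 - coeff * s1 * s2;
        }

        int main(void)
        {
            enum { FS = 1000, N = 4000 }; /* 4 s sampled at 1 kHz */
            static double x[N];
            /* Synthetic photodiode trace: 60 Hz refresh flicker plus a
             * weak 15 Hz component such as FRC might add. */
            for (int i = 0; i < N; i++) {
                double t = (double)i / FS;
                x[i] = sin(2 * M_PI * 60 * t)
                     + 0.1 * sin(2 * M_PI * 15 * t);
            }
            const double freqs[] = { 15.0, 30.0, 60.0 };
            for (int i = 0; i < 3; i++)
                printf("%5.1f Hz: power %g\n", freqs[i],
                       goertzel_power(x, N, freqs[i], FS));
            return 0;
        }

    On a real capture, a clear 15 Hz peak with no obvious on-screen 15 Hz content would then be the dithering suspect.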


      KM I'll read through this later

      6 days later

      On Linux we have other driver choices (like modesetting or fbdev). Does anyone know of alternative display drivers for Windows (other than the basic VGA driver, which doesn't support external monitors)?


        ryans No. Better to get Intel et al. to fix the root cause, if you ask me.

        I've been trying to set up an OS X virtual machine on my desktop so I can speed up development of my GPU driver fix for MacBooks (so I don't have to borrow my mother's MBP). Grumble grumble.

        I already have a 2015 MBP of my own; you only need one physical OS X machine with the GPU for development, and the other machine, for debugging, can be a VM.

        6 days later

        ryans Yes. I'm looking into a possible solution that should allow you to disable dithering on Windows, OS X, and Linux with AMD graphics. No promises 🙂
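        For the Linux side at least, one angle I'm considering is in the radeon KMS driver, where dce4_program_fmt() in drivers/gpu/drm/radeon/evergreen.c programs truncation and dithering through the FMT_BIT_DEPTH_CONTROL register on DCE4+ parts. An untested sketch of mine (register and helper names from the driver source of this era) would be to force that register to zero and rebuild the module:

            /* Untested sketch against the radeon KMS driver: write 0 to
             * FMT_BIT_DEPTH_CONTROL (no truncation, no spatial/temporal
             * dither) instead of the computed value at the end of
             * dce4_program_fmt(). */
            WREG32(FMT_BIT_DEPTH_CONTROL + radeon_crtc->crtc_offset, 0);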

        Hi,

        A search for "dithering" in the Android source code shows several results. The nice thing is that it shows the email address of the person who committed the code (if you click on History, it's probably possible to access `git blame` somehow). Many of the source authors are Google employees.

        So if we find a promising git commit (some are not our issue, like ImageMagick), we can email the author and ask what they know about temporal dithering on Android or other sources of flicker. We can confirm on LinkedIn whether the employee is still at Google (or just email anyway and hope they are).

        Apple would probably consider dithering an accessibility issue, but I don't know about Google.

        If there's one thing I've learned on this forum, it's that our problem is largely unknown. We must advocate for ourselves.

        http://androidxref.com/8.0.0_r4/search?q=dithering&defs=&refs=&path=&hist=&project=art&project=bionic&project=bootable&project=build&project=cts&project=dalvik&project=developers&project=development&project=device&project=docs&project=external&project=frameworks&project=hardware&project=kernel&project=libcore&project=libnativehelper&project=packages&project=pdk&project=platform_testing&project=prebuilts&project=sdk&project=system&project=test&project=toolchain&project=tools
