diop I logged in today and my response has been removed!

Almost like they're hiding something.. 🙁

Seagull Recently tested a couple of low-end Pentium laptops with 610 and 620 graphics and they caused strain within minutes. If what you said is true, it seems dithering ain't the culprit. They were PWM-free as well.

AGI
It is pretty hard to prove anything medical with a sample size of one. Like smoking, asbestos, and lead paint, there will need to be research. In this case it may be possible to use the military research on non-lethal weapons as evidence of harm potential.

My E-ink monitors show dithering; the panel refresh is slow enough that you can actually watch it. I can't recall if I have used 6th gen Intel specifically, but Intel has typically been the worst for dithering. There was Ditherig.exe that supposedly disabled it on Intel, but that never worked on my setup. The only sure bet I have seen is Nvidia on Linux with the disable-dithering setting turned on in the driver. Radeon used to work in older versions of Linux, but not any more.
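For anyone wanting to try the Nvidia-on-Linux route, the driver's dithering attribute can be set from the command line via nvidia-settings. A minimal sketch, assuming the proprietary driver is installed and the display is named DP-0 (yours will likely differ; `nvidia-settings -q dpys` lists them):

```python
import shutil
import subprocess

# Build the nvidia-settings command that forces dithering off.
# "DPY:DP-0" is an assumed display name; the attribute values are
# 0 = Auto, 1 = Enabled, 2 = Disabled.
cmd = ["nvidia-settings", "--assign", "[DPY:DP-0]/Dithering=2"]
print(" ".join(cmd))

# Only attempt to run it when the proprietary driver tools are present.
if shutil.which("nvidia-settings"):
    subprocess.run(cmd, check=False)
```

The same change can be made persistent by putting it in an autostart script, since the setting does not always survive a reboot.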

    ShivaWind Could you please make a video of it? I'm doing research with a hacker group but we don't have E-ink displays. Please add me on Skype or email me at mjanas555 at gmail dot com.

    Harrison Generally yes, but monitor choice has a big effect too. I am sensitive to polarisation of light also, and this plays a big role in my eye strain/migraines.

    ShivaWind I am afraid that may not be sufficient as a test for dithering. Driving E-ink displays is very different to driving an LCD. What you are seeing could just be an artefact of the conversion process being carried out by your e-ink screen.

      Seagull

      The E-ink monitors are a good way to detect dithering by the video card. The dithering seen on the E-ink screen can be toggled by setting dithering on/off in the video card driver, proving it comes from the video card and not the monitor. Placing an inline recording device to “hide” the E-ink screen from the video card does not change this. I will make a video showing the dithering starting and stopping as it is enabled/disabled.

        ShivaWind

        That's interesting. Which cards have you been able to test toggling dithering on/off with? The only instance I have found of this is with Ditherig.

        Seagull 6th gen Intel integrated graphics do not have temporal dithering, tested on Windows 10. Intel are the only graphics I have tested which do not have temporal dithering.

        Laptop or desktop? I have a Lenovo 6th gen desktop (i3-6100T) with Ditherig running, which I cannot use due to strain.

        Are you running the latest Windows 10 and the latest Intel driver on that machine? My understanding from Intel threads is that dithering is enabled by default on 8bpc displays and above. The only machine I have ever seen Ditherig work on in real life is a laptop; I have never had success on desktops.

          diop

          Desktop. Latest W10 and driver as of when I did the testing. The capture card was set to 8-bit 60 Hz. In addition, turning temporal dithering on using the Ditherig options resulted in my software detecting dithering. Without doubt, Ditherig worked on this desktop PC.

          Different rules may apply on a laptop, where the Intel chip may dither. I haven't tested laptops with the capture card, but using Ditherig on a laptop created banding consistent with going from 6-bit+FRC to plain 6-bit.
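The capture-card test described above amounts to a simple check: with a still image on screen, any pixel that changes between two consecutive captured frames is evidence of temporal dithering (or capture noise, so a known-clean baseline is needed). A minimal NumPy sketch of the idea; the frames here are simulated, whereas in practice they would be two consecutive grabs from the capture card:

```python
import numpy as np

def changed_pixel_fraction(frame_a, frame_b):
    """Fraction of pixels that differ between two captures of a static image."""
    return float(np.mean(np.any(frame_a != frame_b, axis=-1)))

rng = np.random.default_rng(0)

# Simulated static desktop frame (1080p, 8-bit RGB).
static = rng.integers(0, 256, size=(1080, 1920, 3), dtype=np.uint8)

# No temporal dithering: consecutive frames are identical.
print(changed_pixel_fraction(static, static))  # 0.0

# Temporal dithering: roughly half the pixels flip their least-significant bit.
mask = (rng.random((1080, 1920, 1)) < 0.5).astype(np.uint8)
dithered = static ^ mask
print(changed_pixel_fraction(static, dithered))  # ~0.5
```

A real capture would also pick up compression or cable noise, so the useful signal is a jump in the changed-pixel fraction when dithering is toggled in the driver, not the absolute number.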


            One thing I did find on Suguru's page (author of Ditherig) was a little Q&A section.

            About Dithering Settings for Intel Graphics

            Q: It does not work and shows "Failed to load a DLL".
            A: It seems you are running a version which does not match the OS version. Please run the 64-bit version if your OS is 64-bit.

            Q: It does not work on newer versions of Windows 10.
            A: It seems Device Guard or Credential Guard prevents it from loading the kernel driver. Please turn off the Hyper-V feature in Control Panel.

            I am interested to know how Amulet Hotkey made their fix; the kext must be signed by Apple or they paid for a cert, and the Windows fix must have been made in collaboration with Nvidia.

            I don't think Amulet are going to email out the dithering fixes any time soon, as they are their property and also require their hardware to work. Who would they have contacted at Nvidia/AMD to produce the fixes? The frustrating thing here is that it seems such a trivial fix (literally one line of code), but for whatever reason it is shrouded in secrecy.

            Is there any way, through LinkedIn or otherwise, to directly contact a dev from either company and ask how to disable dithering?

            Seagull using ditherig on a laptop created banding consistent with going from 6bit+FRC to just 6bit.

            Laptop with what GPU?

              vaz

              HP laptop with a TN screen, 4th gen i3 with Iris graphics. The banding after enabling Ditherig was pretty clear. It's a TN screen, so it's natively 6-bit, and as it's modern, the Intel GPU is driving the dithering - hence enabling Ditherig produces the banding. As I have said before, in all the testing I have done I have found Ditherig to be effective. However, in some cases I have found that Intel integrated graphics do not dither by default, as was the case with a desktop PC with a 6th gen i5.

              It seems likely to me that Intel integrated graphics only use temporal dithering on devices connected via eDP, the standard for internally connecting laptop screens. My results thus far support this.
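The banding described above follows directly from the bit depths: a 6-bit panel has only 64 levels per channel, so an 8-bit gradient collapses into visible steps unless FRC/dithering flickers between adjacent levels so their temporal average approximates the intermediate value. A small NumPy sketch of the arithmetic (illustrative only, not how any real panel controller is implemented):

```python
import numpy as np

gradient = np.arange(256, dtype=np.float64)  # ideal 8-bit grey ramp, 0..255

# Plain 6-bit panel: drop the two least-significant bits -> coarse steps.
six_bit = (gradient.astype(np.uint8) >> 2) << 2
print(len(np.unique(six_bit)))  # 64 levels instead of 256 -> visible banding

# 6-bit + FRC: alternate between the two nearest 6-bit levels over frames
# so the temporal average approximates the full 8-bit value.
lo = (gradient // 4) * 4              # lower 6-bit level, in 8-bit units
hi = np.minimum(lo + 4, 252)          # next 6-bit level (252 is the topmost)
weight = (gradient - lo) / 4          # fraction of frames showing `hi`
avg = lo * (1 - weight) + hi * weight
print(float(np.max(np.abs(avg - gradient))))  # 3.0: only the top codes are unreachable
```

Disabling the dithering (e.g. with Ditherig) leaves only the first case, which is exactly the 6-bit banding reported.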

              I'll be experimenting with the HP laptop this week if you have anything specific you want me to do to it, though I can't capture at the moment as my capture card is 4 hours away at my office.

                I found an interesting discussion about deep color support in FirePro and Quadro:

                As far as I know, FirePro and Quadro drivers offer nonstandard
                proprietary ways to display 10-bit images inside a windowed application
                over a "legacy" 8-bit desktop manager (thus the user isn't forced into
                DirectX or OpenGL fullscreen exclusive rendering modes, and
                developers don't need to create their own specialized user interface
                for that unique display mode).

                9 days later

                Seagull I have noticed (even after all the vision therapy and prismatic glasses) that on my MacBook Pro 15in 2018, when the AMD GPU is in use it is a lot better - so much so that on good days it feels like old tech, with no pain. It's hard to check, so I keep a window open that shows whether the card in use is the integrated or the "high performance" one (and keep a demanding app like Photoshop open in the background to force the high-performance card into use). The only downside is that the machine has to stay plugged in, otherwise the battery drains fast.

                Would you think, then, that when the integrated Intel card is used it forces dithering because of the eDP connection, and the AMD card does not?

                  ShivaWind I've actually noticed other strange things on E-ink that I think I mentioned somewhere else on here. One was that the mouse cursor can create a shadow to the right of it, as if a light were shining across the screen. This made me wonder whether a lot of rendering on computers can be imperfect because they use vector-driven graphics cards to draw. I've also wondered if it is the way things are refreshed that makes us sick. The updating of what's on the screen doesn't look particularly orderly when you watch it in slow motion on E-ink.

                  martin

                  With a dual-GPU laptop it's hard to say what's going on, as the Intel and AMD GPUs will be working together to some extent. But I'd caution you against believing that because there is no discomfort there is no dithering. The monitor I am using at the moment is only comfortable when hooked up to my GTX 660, which dithers. Different Nvidia cards induce strain, and using the Intel GPU, which does not dither, induces strain. Something about the superposition of this BenQ's native TN dithering and the GTX 660's dithering makes it comfortable. It's possible that your Intel GPU and AMD GPU are both adding dithering, but the combination is comfortable to you.

                  10 days later

                  I found a patent on Google entitled "Programmable dithering for video displays"; it contains some quite interesting information on dithering. Link: https://patents.google.com/patent/US20100238193

                  However, depending on the algorithm used to vary an applied dither pattern from frame to frame, temporal dithering can cause the undesirable visible artifact known as “flicker.” Flicker results when dithering produces a sequence of pixels that are displayed at the same location on a display screen with periodically varying intensity, especially where the frequency at which the intensity varies is in a range to which the eye is very sensitive. The human eye is very sensitive to flicker that occurs at about 15 Hz, and more generally is sensitive to flicker in the range from about 4 Hz to 30 Hz (with increasing sensitivity from 4 Hz up to 15 Hz and decreasing sensitivity from 15 Hz up to 30 Hz). If the pixels displayed at the same screen location (with a frame rate of 60 Hz) have a repeating sequence of intensities (within a limited intensity range) that repeats every four frames due to dithering, a viewer will likely perceive annoying 15 Hz flicker, especially where each frame contains a set of identical pixels of this type that are displayed contiguously in a large region of the display screen. However, if the pixels displayed at the same screen location (with a frame rate of 60 Hz) have a repeating sequence of intensities (in the same intensity range) that repeats every sixteen frames, a viewer will be much less likely to perceive as flicker the resulting 3.75 Hz flicker.

                  In order to reduce the flicker caused by temporal dithering a repeating sequence of dither bits with a sufficiently long period of repetition can be used. However, until the present invention, dithering had not been implemented in a programmable manner that allows the user to vary both spatial and temporal dither parameters so as to reduce artifacts caused by the dither process itself or the interaction of dithering with other processes, such as pixel inversion.
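The patent's flicker arithmetic is easy to verify: a dither pattern that repeats every N frames at refresh rate R produces a fundamental intensity cycle of R/N Hz at each affected screen location. A quick check of the figures quoted above:

```python
def flicker_hz(refresh_rate_hz, dither_period_frames):
    """Fundamental frequency of the intensity cycle that a repeating
    temporal-dither pattern produces at a fixed screen location."""
    return refresh_rate_hz / dither_period_frames

print(flicker_hz(60, 4))   # 15.0 Hz -- right at the peak of human flicker sensitivity
print(flicker_hz(60, 16))  # 3.75 Hz -- below the 4-30 Hz sensitive band
```

This is why the patent favours long dither periods: at 60 Hz, any repeat period shorter than about 15 frames puts the flicker fundamental inside the 4-30 Hz band the eye is most sensitive to.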

                    dev