Looking at the Intel support forums (Windows), there are lots of reports of incorrect color depth being detected for displays, and users complaining they can't select 10/12-bit color. There are also a few requests to disable dithering.

Thread - https://forums.intel.com/s/question/0D50P00004VyUR0SAN/10bit-bit-depth-hevc-with-iris-plus-graphics-650?language=en_US

Our driver supports Color Depths of 8-bit or 12-bit via HDMI*.
Currently, if a 10-bit display is used the driver will default to 8-bit with Dithering or 12-bit if supported.
Please refer to Deep Color Support of Intel Graphics White Paper. (Page 11)
There is already a request to allow users to manually select the desired Color Depth via IGCC (Intel® Graphics Command Center). This is a work in progress with no ETA; however, it is on our top-priority list.
The above doesn't impact at all the Encoding/Decoding capabilities of the graphics controller. HEVC 10-bit video encoding/decoding via Hardware is supported by the graphics controller.

There was another post I saw, and I forget the link, but the support answer was that the driver automatically enables dithering for 8-bit and above, and disables it for 6-bit. Which seems backwards, as a poster pointed out. Is there a way to spoof or rewrite an EDID so the display is detected as 6-bit?
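
As a rough illustration of what such an EDID rewrite would involve, here is a minimal sketch that patches the colour-depth field of an EDID 1.4 base block (byte 20 carries the Video Input Definition, byte 127 the block checksum). This is illustrative only; actually applying an override would still need an OS-level mechanism (e.g. a registry EDID override on Windows), which is not shown:

```python
def set_edid_6bpc(edid: bytes) -> bytes:
    """Patch an EDID so a digital display reports 6 bits per colour.

    Assumes an EDID 1.4 base block: byte 20 is the Video Input
    Definition (bit 7 set = digital input; bits 6:4 encode colour
    depth, 0b001 = 6 bpc) and byte 127 is a checksum chosen so that
    all 128 bytes sum to 0 mod 256.
    """
    if len(edid) < 128:
        raise ValueError("need a full 128-byte EDID base block")
    b = bytearray(edid[:128])
    if not b[20] & 0x80:
        raise ValueError("analog input: no colour-depth field to patch")
    b[20] = (b[20] & 0x8F) | (0b001 << 4)  # clear bits 6:4, set 6 bpc
    b[127] = (-sum(b[:127])) % 256         # recompute block checksum
    return bytes(b) + edid[128:]           # keep any extension blocks
```

Whether the Intel driver would then actually skip dithering for a 6 bpc panel is exactly the open question in the thread above.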

Intel Deep Color White Paper - https://www.intel.com/content/dam/www/public/us/en/documents/white-papers/deep-color-support-graphics-paper.pdf

Most of the traditional laptop panels were internally of 6bpc (=18bpp panels). This naturally means even a normal 24bpp image can show dithering when viewed on an 18bpp panel. To avoid this, a process called “dithering” is applied which is almost like introducing a small noise to adjoining pixel. This will create variations in 18bit representation and results in hiding color banding on such panels. Either the GPU’s display HW or panel itself might do this. When a panel does this, source (GPU display HW) is not aware of the same and panel will advertise itself as a normal 8bpc (24bit) panel

So I gather from this that dithering is always enabled as a failsafe against a poor 6-bit+FRC implementation. Which in a way makes sense, because there could be very cheap monitors out there that are advertised as 8-bit but are really 6-bit+FRC (as we know), so dithering on the GPU side ensures a consistent 8-bit output across all monitors.
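
The 6-bit+FRC idea can be shown with a toy quantiser. This is a simplified sketch only (random noise standing in for a real FRC pattern): truncating 8-bit values to 6 bits collapses smooth gradients into flat bands, while adding sub-step noise first makes neighbouring pixels flip between the two nearest levels, hiding the banding at the cost of visible noise:

```python
import random

def to_6bit(value_8bit: int, dither: bool = False) -> int:
    """Quantise an 8-bit channel value (0-255) down to 6 bits (0-63).

    Without dithering, truncation maps every 4 adjacent input shades
    onto one output level, which shows up as banding on gradients.
    With dither=True, sub-step noise is added first, so a pixel lands
    on one of the two nearest 6-bit levels and, averaged over area or
    time (FRC), approximates the original shade.
    """
    if dither:
        value_8bit = min(255, value_8bit + random.randint(0, 3))
    return value_8bit >> 2

# A smooth 256-level ramp collapses into 64 flat bands when truncated:
banded = {to_6bit(v) for v in range(256)}
print(len(banded))  # 64 distinct levels instead of 256
```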

Windows* OS doesn’t support more than 8 bpc for desktop. This means even if an application has a more than 8 bpc content, it will all be compressed to 8 bpc during desktop window composition process as shown in figure below

So perhaps all these applications that are starting to cause strain are designed with HDR in mind, yet are being dithered down to 8 bpc?

In many respects I can understand why dithering is enabled. It's much easier to force all displays to 8-bit than to risk an incorrectly detected monitor/color combination. However, I think advanced controls should still be available to the consumer, because dithering on a true 8-bit monitor produces extra noise where none is needed, so I'm not getting what I pay for. Eye strain or not, dithering isn't needed if the monitor is detected correctly and the correct color range is selected.

I have an MRI showing recent tissue damage in my brain. Flicker/strobing is being used as a non-lethal weapon by the military. If a good lawyer got on this, I am sure there is, or will be, a solid case. I think you could possibly prove dithering is functionally equivalent to the military LED incapacitator, which is known to be harmful. We just need to make sure it is provable that the video card and monitor manufacturers were made aware of this; then it would be willful disregard for safety rather than negligence.

    ShivaWind They definitely are aware dithering is used, and it's also documented everywhere I look online. Most gamers want it enabled! The fact that Amulet Hotkey has made tools to disable it tells me it must be on everywhere. No definition of dithering I see online specifically mentions 'flicker', only 'changes the color representation' or 'adjusts nearby pixels to smooth gradients'. Most posts online agree that dithering is a 'down and dirty' method and isn't a perfect solution.

    ShivaWind I have an MRI showing recent tissue damage in my brain.

    First, sorry to hear it! Question: could you prove it is due to dithering/flickering of displays? I have been in a nightmare for almost two years; I have never been so unwell because of electronics and lighting. Yet my MRI, taken 2 months ago, was immaculate, and I was told I have a very healthy brain. If you have not followed the post in which I talked about it: basically, my ophthalmologist thought the MRI could explain the twitching of my left eye (eyelid and face muscles). It could not, though.

      I posted on the Intel forums yesterday. They have a thread for suggesting feature requests for Intel Graphics Command Center in Windows. I added a response to the thread, agreeing with a previous reply asking for user-selectable colour depth, but also asking for an enable/disable dithering option. I log in today, and my response has been removed! I double-checked and am sure it appeared immediately after posting yesterday; I don't think posts are approved before they get added.

      I'm a little annoyed by this. The Intel devs have an IRC channel (#intel-gfx) which they frequent; however, it is for general graphics talk only, not for submitting bugs/features, and I believe it is Linux only. I suppose one could still politely ask in IRC: "is temporal dithering enabled with Intel drivers?".

      The thread is here > https://forums.intel.com/s/question/0D50P00004H90FcSAJ/intel-gcc-display-media-feedback

      Please have your say and ask for user controllable colour output settings, and dithering checkbox/option.

        diop I log in today, and my response has been removed!

        Almost like they're hiding something.. 🙁

        Seagull Recently tested a couple of low-end Pentium laptops with 610 and 620 graphics and they caused strain within minutes. If what you said is true, it seems dithering isn't the culprit. They were PWM-free as well.

        AGI
        It is pretty hard to prove anything medical with a sample size of one. Like smoking, asbestos, and lead paint, there will need to be research. In this case it may be possible to use the military's research on non-lethal weapons as proof of the potential for harm.

        My E-ink monitors show dithering; the panel refresh is slow enough that you can actually watch it. I can't recall if I have used 6th-gen Intel specifically, but Intel has typically been the worst for dithering. There was Ditherig.exe, which supposedly disabled it on Intel, but that never worked on my setup. The only sure bet I have seen is Nvidia on Linux with the disable-dithering setting enabled in the driver. Radeon used to work in older versions of Linux, but not any more.

          ShivaWind Could you please make a video of it? I'm doing research with a hacker group but we don't have E-ink displays. Please add me on Skype or email me at mjanas555 at gmail dot com.

          Harrison Generally yes, but monitor choice has a big effect too. I am sensitive to polarisation of light also, and this plays a big role in my eye strain/migraines.

          ShivaWind I am afraid that may not be sufficient as a test for dithering. Driving E-ink displays is very different to driving an LCD. What you are seeing could just be an artefact of the conversion process being carried out by your e-ink screen.

            Seagull

            The E-ink monitors are a good way to detect dithering by the video card: the dithering seen on the E-ink screen can be toggled by switching dithering on/off in the video card driver, proving it comes from the video card and not the monitor. Placing an inline recording device to “hide” the E-ink screen from the video card does not change this. I will make a video showing the dithering starting and stopping as it is enabled/disabled.
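
A capture-based test like this reduces to a frame-diff check. Sketch only, under the assumption that `frames` holds consecutive captures of a static test image, each a flat list of pixel channel values:

```python
def looks_temporally_dithered(frames, tolerance=0):
    """Return True if pixel values change between consecutive captures.

    `frames` is a list of captures of a *static* test image, each a
    flat list of pixel channel values. Through a dither-free pipeline a
    static image should capture identically every time, so any pixel
    flipping between frames points to temporal dithering upstream of
    the capture (or E-ink) device. `tolerance` allows for capture noise.
    """
    first = frames[0]
    return any(
        abs(p - q) > tolerance
        for frame in frames[1:]
        for p, q in zip(first, frame)
    )
```

In practice a small `tolerance` would be needed to separate genuine dithering from analogue capture noise.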

              ShivaWind

              That's interesting. Which cards have you been able to test toggling dithering on/off with? The only instance I have found of this is with Ditherig.

              Seagull 6th-gen Intel integrated graphics do not have temporal dithering, tested on Windows 10. Intel is the only graphics vendor I have tested that does not have temporal dithering.

              Laptop or desktop? I have a Lenovo 6th-gen desktop (i3-6100T) with Ditherig running, which I cannot use due to strain.

              Are you running the latest Windows 10 and the latest Intel driver on the machine? My understanding from Intel threads is that dithering is enabled by default on 8 bpc displays and above. The only machine I have ever seen Ditherig work with IRL is a laptop. I have never had success on desktops.

                diop

                Desktop. Latest W10 and driver as of when I did the testing. The capture card was set to 8-bit 60Hz. In addition, turning temporal dithering on using the Ditherig options resulted in my software detecting dithering. Without doubt, Ditherig worked on this desktop PC.

                Different rules may apply on a laptop, where the Intel chip may dither. I haven't tested laptops with the capture card, but using Ditherig on a laptop created banding consistent with going from 6-bit+FRC to plain 6-bit.


                  One thing I did find on Suguru's page (author of Ditherig) was a little Q&A section.

                  About Dithering Settings for Intel Graphics

                  It does not work and shows "Failed to load a DLL"
                  It seems you run the version which does not match the OS version.
                  Please run the 64bit version if your OS is 64bit.

                  It does not work on newer versions of Windows 10.
                  It seems Device Guard or Credential Guard prevents it from loading the kernel driver.
                  Please turn off Hyper-V feature from Control Panel.

                  I am interested to know how Amulet Hotkey made their fix, the kext must be signed by Apple or they paid for a cert, and the Windows fix must have been in collaboration with Nvidia.

                  I don't think Amulet are going to share the dithering fixes any time soon, as they are their property and also require their hardware to work. Who would they have contacted at Nvidia/AMD to produce the fixes? The frustrating thing here is that it seems such a trivial fix (literally one line of code), but for whatever reason it is shrouded in secrecy.

                  Is there any way, through LinkedIn or otherwise, to directly contact a dev from either company and ask how to disable dithering?

                  Seagull using ditherig on a laptop created banding consistent with going from 6bit+FRC to just 6bit.

                  Laptop with what GPU?

                    vaz

                    HP laptop with a TN screen, 4th-gen i3 with Iris graphics. Banding after enabling Ditherig was pretty clear. It's a TN screen, so it's natively 6-bit, and as it's modern, the Intel GPU is driving the dithering; hence enabling Ditherig produces the banding. As I have said before, in all the testing I have done I have found Ditherig to be effective. However, in some cases I have found that Intel integrated graphics do not dither by default, as was the case with a desktop PC with a 6th-gen i5.

                    It seems likely to me that Intel integrated graphics only use temporal dithering on devices connected via eDP, the standard for internally connecting laptop screens. This is supported by my results so far.

                    I'll be experimenting with the HP laptop this week if you have anything specific you want me to do to it, though I can't capture at the moment as my capture card is 4 hours away in my office.

                      dev