Taking Action
- Edited
diop how do you all work right now? Don't you get really bad symptoms?
It has been awful for the past year and a half. I go home worn out and I start the morning in pretty much the same condition. My productivity has become very low in spite of all my efforts. I thought of changing jobs, but not understanding what causes my issues, plus the sensitivity I have developed to modern lighting as well as electronic devices, makes me hesitate. I am completely lost. I have not received any real support or insight from any of the medical doctors and practitioners I have visited. Recently I have been spending a fortune on various types of massage and on acupuncture to release neck and face tension, but any relief is only temporary. Exposure to a bad device restores the symptoms within a few minutes. The worst are probably the eyelid and facial spasms.
I have not yet been to an orthoptics specialist, because here that discipline is considered a "folk remedy". I will try my luck in Europe during the Christmas break. That is my last option.
My problems date to before 2012. I used my ThinkPad T60 on XP from 2010 to 2017, day and night, with no problem, but from 2007 to 2010 I kept it in a drawer thinking it was unusable. What made the difference was changing the display resolution from native to 1024x768. Same hardware, same software. It was not a matter of font size, because I ran that test too. At native resolution my eyes fried; at the lower resolution I could work for hours without a break.
My eyes or brain are not able to cope with external optical impulses the vast majority of people are fine with. That is my only conclusion.
With regard to making our discomfort more public, maybe one option is to support already existing campaigns, e.g., https://lightaware.org/about/individual-stories/.
That is what one of the ambassadors wrote to me months ago: "We are collecting case studies of the way that lighting affects different people to show how many people are affected and how it impacts on their daily lives. The idea is that if we can collect hundreds of similar stories then we can show medics and researchers that this is a problem which is affecting a diverse range of people, and also potentially provide case studies for researchers."
In my opinion there are many people affected; we are not just a few hundred. Should we try to have sufferers converge on one or a few platforms rather than spreading out?
- Edited
Looking at the Intel support forums (Windows), there are lots of reports of incorrect color depth being detected for displays, and users complaining they can't select 10/12-bit color. There are also a few requests to disable dithering. One support answer reads:
Our driver supports Color Depths of 8-bit or 12-bit via HDMI*.
Currently, if a 10-bit display is used the driver will default to 8-bit with Dithering or 12-bit if supported.
Please refer to Deep Color Support of Intel Graphics White Paper. (Page 11)
There is already a request to allow users to manually select the desired Color Depth via IGCC (Intel® Graphics Command Center). This is a work in progress with no ETA; however, it is on our top priority list.
The above doesn't impact at all the Encoding/Decoding capabilities of the graphics controller. HEVC 10-bit video encoding/decoding via Hardware is supported by the graphics controller.
There was another post I saw (I forget the link), but the support answer was that the driver automatically enables dithering for 8-bit and above, and disables it for 6-bit, which seems backwards, as a poster pointed out. Is there a way to spoof or rewrite an EDID so the display is detected as 6-bit?
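For anyone curious what an EDID override would actually involve, here is a minimal sketch. It assumes a standard 128-byte EDID 1.4 base block, where byte 20 (Video Input Parameters) carries the bits-per-color for digital displays in bits 6:4 (0b001 = 6 bpc) and byte 127 is the checksum. Whether the driver then really treats the panel as 6-bit and skips dithering is exactly the open question, so treat this as an experiment, not a fix.

```python
# Sketch: patch an EDID dump so a digital display advertises 6 bpc.
# Assumes an EDID 1.4 base block (128 bytes). Byte 20 is the Video
# Input Parameters byte: bit 7 set = digital input, bits 6:4 = colour
# bit depth (0b001 = 6 bpc). Byte 127 is the base-block checksum.

def patch_edid_to_6bpc(edid: bytearray) -> bytearray:
    assert len(edid) >= 128, "need at least the 128-byte base block"
    assert edid[0:8] == bytes([0x00, 0xFF, 0xFF, 0xFF,
                               0xFF, 0xFF, 0xFF, 0x00]), "bad EDID header"
    assert edid[20] & 0x80, "analog input: no bit-depth field to patch"

    # Set colour depth field (bits 6:4) to 0b001 = 6 bits per colour.
    edid[20] = (edid[20] & ~0x70) | (0b001 << 4)

    # Recompute the checksum: all 128 base-block bytes must sum to 0 mod 256.
    edid[127] = (-sum(edid[0:127])) & 0xFF
    return edid
```

The patched blob could then be loaded with the Linux `drm.edid_firmware` kernel parameter or a Windows EDID-override registry entry; both mechanisms exist, but I have not verified that either makes Intel's driver drop dithering.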
Intel Deep Color White Paper - https://www.intel.com/content/dam/www/public/us/en/documents/white-papers/deep-color-support-graphics-paper.pdf
Most of the traditional laptop panels were internally of 6bpc (=18bpp panels). This naturally means even a normal 24bpp image can show dithering when viewed on an 18bpp panel. To avoid this, a process called “dithering” is applied which is almost like introducing a small noise to adjoining pixel. This will create variations in 18bit representation and results in hiding color banding on such panels. Either the GPU’s display HW or panel itself might do this. When a panel does this, source (GPU display HW) is not aware of the same and panel will advertise itself as a normal 8bpc (24bit) panel
So I gather from this that the reason dithering is always enabled is as a failsafe against a poor 6-bit+FRC implementation. That makes some sense, because there could be very cheap monitors out there advertised as 8-bit that are really 6-bit+FRC (as we know), so dithering on the GPU side ensures a consistent 8-bit output across all monitors.
Windows* OS doesn’t support more than 8 bpc for desktop. This means even if an application has a more than 8 bpc content, it will all be compressed to 8 bpc during desktop window composition process as shown in figure below
So perhaps all these applications that are starting to cause strain are designed with HDR in mind, however they are being dithered down to 8bpc?
In many respects I can understand why dithering is enabled: it's much easier to force all displays to 8-bit than to risk an incorrectly detected monitor/color combination. However, advanced controls should still be available to the consumer, because dithering on a true 8-bit monitor produces extra noise where none is needed, so I'm not getting what I pay for. Eye strain or not, dithering isn't needed if the monitor is detected correctly and the correct color range is selected.
I have an MRI showing recent tissue damage in my brain. Flicker/strobing is being used as a non-lethal weapon by the military. If a good lawyer got onto this, I am sure there is or will be a solid case. I think you could possibly prove that dithering is functionally equivalent to the military LED incapacitator, which is known to be harmful. We just need to make sure it is provable that the video card and monitor manufacturers were made aware of this; then it would be wilful disregard for safety rather than negligence.
ShivaWind They definitely are aware dithering is used, and it's documented everywhere I look online. Most gamers want it enabled! The fact that Amulet Hotkey has made tools to disable it tells me it must be on everywhere. No definition of dithering I have seen online specifically mentions 'flicker', only 'changes the color representation' or 'adjusts nearby pixels to smooth gradients'. Most posts online agree that dithering is a 'down and dirty' method and isn't a perfect solution.
- Edited
ShivaWind I have an MRI showing recent tissue damage in my brain.
First, I am sorry to hear that! Question: can you prove it is due to dithering/flickering of displays? I have been living a nightmare for almost two years; I have never been so unwell because of electronics and lighting. Yet my MRI, taken two months ago, was immaculate, and I was told I have a very healthy brain. If you have not followed the post in which I talked about it: basically, my ophthalmologist thought the MRI could explain the twitching of my left eye (eyelid and face muscles). It could not, though.
- Edited
I posted on the Intel forums yesterday. They have a thread for suggesting feature requests for Intel Graphics Command Center in Windows. I added a response to the thread, agreeing with a previous reply asking for user-selectable colour depth, but also asking for an option to enable/disable dithering. I logged in today, and my response has been removed! I double-checked and am sure it appeared immediately after posting yesterday; I don't think posts are approved before they get added.
I'm a little annoyed by this. The Intel devs have an IRC channel (#intel-gfx) which they frequent; however, it is for general graphics talk only, not for submitting bugs/features, and I believe it is Linux only. Still, I suppose one could politely ask in IRC: "is temporal dithering enabled in Intel drivers?".
The thread is here > https://forums.intel.com/s/question/0D50P00004H90FcSAJ/intel-gcc-display-media-feedback
Please have your say and ask for user controllable colour output settings, and dithering checkbox/option.
- Edited
6th gen Intel integrated graphics do not have temporal dithering, tested on windows 10. Intel are the only graphics I have tested which do not have temporal dithering.
- Edited
AGI
It is pretty hard to prove anything medical with a sample size of one. As with smoking, asbestos, and lead paint, there will need to be research. In this case it may be possible to use the military research on non-lethal weapons as proof of harm potential.
My E-ink monitors show dithering; the panel refresh is slow enough that you can actually watch it. I can't recall if I have used 6th-gen Intel specifically, but Intel has typically been the worst possible for dithering. There was Ditherig.exe, which supposedly disabled it on Intel, but that never worked on my setup. The only sure bet I have seen is Nvidia on Linux with the disable-dithering setting in the driver. Radeon used to work in older versions of Linux, but not any more.
Harrison Generally yes, but monitor choice has a big effect too. I am sensitive to polarisation of light also, and this plays a big role in my eye strain/migraines.
ShivaWind I am afraid that may not be sufficient as a test for dithering. Driving an E-ink display is very different from driving an LCD; what you are seeing could just be an artefact of the conversion process carried out by your E-ink screen.
The E-ink monitors are a good way to detect dithering by the video card: the dithering seen on the E-ink screen can be toggled by switching dithering on/off in the video card driver, proving it comes from the video card and not the monitor. Placing an inline recording device to "hide" the E-ink screen from the video card does not change this. I will make a video showing the dithering starting and stopping as it is enabled/disabled.
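For those with a capture card instead of an E-ink screen, the same toggle test can be automated. A minimal sketch of the idea, assuming you can grab raw frames from your capture device's SDK (frame acquisition is left abstract here): capture two frames of a perfectly static test image and count the bytes that changed. On a clean pipeline the frames should be identical; temporal dithering shows up as small per-pixel differences scattered across the image.

```python
# Sketch of capture-card dithering detection: compare two captures of
# the same static content. Frames are raw pixel data as bytes; how you
# obtain them depends on your capture device (not shown here).

def count_changed_pixels(frame_a: bytes, frame_b: bytes) -> int:
    """Number of byte positions that differ between the two frames."""
    assert len(frame_a) == len(frame_b), "frames must be the same size"
    return sum(1 for a, b in zip(frame_a, frame_b) if a != b)

def looks_like_temporal_dithering(frame_a: bytes, frame_b: bytes,
                                  threshold: float = 0.001) -> bool:
    """Flag the pipeline if more than `threshold` of the bytes differ
    between two captures of static content. The threshold is a guess;
    tune it for your capture card's own noise floor."""
    changed = count_changed_pixels(frame_a, frame_b)
    return changed / len(frame_a) > threshold
```

Toggling the driver's dithering setting should flip the result from False to True on static content, mirroring the E-ink on/off test described above.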
- Edited
Seagull 6th gen Intel integrated graphics do not have temporal dithering, tested on windows 10. Intel are the only graphics I have tested which do not have temporal dithering.
Laptop or desktop? I have a Lenovo 6th-gen desktop (i3-6100T) with ditherig running, which I cannot use due to strain.
Are you running the latest Windows 10 and the latest Intel driver on the machine? My understanding from Intel threads is that dithering is enabled by default for 8 bpc displays and above. The only machine I have ever seen ditherig work on in real life is a laptop; I have never had success on desktops.
Desktop. Latest W10 and driver as of when I did the testing. The capture card was set to 8-bit 60 Hz. In addition, turning temporal dithering on using the Ditherig options resulted in my software detecting dithering. Without doubt, ditherig worked on this desktop PC.
Different rules may apply on a laptop, where the Intel chip may dither the internal panel. I haven't tested laptops the same way, but using ditherig on a laptop created banding consistent with going from 6-bit+FRC to plain 6-bit.
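A rough way to quantify that banding, if anyone wants to check their own panel: screenshot a smooth 0-255 gradient and count how many distinct levels survive. The numbers below are the idealised case (real screenshots will be noisier): a true 8-bit path keeps 256 levels, a plain 6-bit path only 64, in bands four levels wide.

```python
# Idealised banding check: a 6-bit path collapses a 256-level gradient
# to 64 distinct levels (steps of 4 in 8-bit terms).

def distinct_levels(row: list[int]) -> int:
    """Count the distinct pixel values in one row of a gradient."""
    return len(set(row))

gradient_8bit = list(range(256))                         # ideal source
gradient_6bit = [(v >> 2) << 2 for v in gradient_8bit]   # 6-bit truncation

# distinct_levels(gradient_8bit) -> 256
# distinct_levels(gradient_6bit) -> 64
```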
One thing I did find on Suguru's page (author of Ditherig) was a little Q&A section.
About Dithering Settings for Intel Graphics
Q: It does not work and shows "Failed to load a DLL".
A: It seems you ran a version which does not match the OS version. Please run the 64-bit version if your OS is 64-bit.
Q: It does not work on newer versions of Windows 10.
A: It seems Device Guard or Credential Guard prevents it from loading the kernel driver. Please turn off the Hyper-V feature from Control Panel.
I am interested to know how Amulet Hotkey made their fix; the kext must be signed by Apple, or they paid for a cert, and the Windows fix must have been made in collaboration with Nvidia.
I don't think Amulet are going to email out the dithering fixes any time soon, as it's their property and also requires their hardware to work. Who would they have contacted at Nvidia/AMD to produce the fixes? The frustrating thing is that it seems such a trivial fix (literally one line of code), yet it is shrouded in secrecy.
Is there any way, through LinkedIn or otherwise, to directly contact a dev from either company and ask how to disable dithering?