A bit off topic: please like this post (button below, for registered members) if you want to participate in a new survey.
Just to add a comment on this: I've had formal somatics training (mainly the Alexander Technique and other related practices) for the last 5 years due to a lower back problem. I believe my neck and whole body are far more "relaxed", or better put, properly toned, than those of the average office worker who has not undergone such training.
I do believe that this is a tech issue. There is something in the technology that creates the problem for a minority of people. I can vividly sense how most of the new displays tense up my eyes and neck, and while I can partially control it and reduce the symptoms, the stimulus causing them is still there. It is like feeding a healthy person the worst possible food every day and expecting them not to become sick.
kammerer So the main point is that the root cause could be in the neck muscles and blood vessels restricting proper blood supply.
You adopt the same posture at any computer workstation, so unless there's a physical reason I don't see how blood flow could be affected by flickering.
One thing that is clear is that we are a minority. Nobody in my immediate family has issues, nor do friends IRL or people I know via Social Media.
However, I am the only person I know of who has strabismus (heterotropia) and a significantly high prescription with prism correction (at least -8 in each eye). In my life I would say I've encountered fewer than 5 people with visible eye misalignment. My eye doctor has always said I am a 'unique case' - well, we all like to be unique at something.
It's not as if there are swarms of people having seizures or complaining of headaches from using the latest tech; if there were a severe epidemic among users across the globe, these issues would have been resolved by now.
I am using a 2019 Dell U2419H and have been using it for 2 months now. It's absolutely fine - I can use it for hours and hours without any issues. I am driving it from my 2010 PC, though. This monitor is PWM-free and TUV-certified for low blue light, so I still had to choose my monitor carefully, but it isn't out of anybody's price bracket, and I'm using it fine. I do understand there is an even smaller minority with sensitivities to LED, but CCFL tech is still perfectly usable for the next decade or so.
Another post has info on heterophoria, but that accounts for 10% of the population so I would have expected more people to be complaining about new technology.
I've had an MRI in the last 5 years, it came back normal. I've had many eye checks due to other issues and the eyes are healthy, and my prescription hasn't changed since I was a teenager.
Anyways, taking action: it's a process of elimination to find the point of failure. I think Linux is a good place to start. Somebody out there has to know the kernel inside out and will be able to explain how Linux DEs render the desktop, what type of dithering/artifacts are used, and how they can be removed. Linux is transparent, so a dithering/artifact-free Linux could be a reality tomorrow, if we find the right programmer. If we can then all test and certify that we can run the 'fixed' distro on our good HW without issue, we test on the bad HW. If the known-good Linux distro on our good setup then causes strain on the bad HW, we're looking at a VBIOS issue or something baked into the HDMI output of new tech. Surely it's as simple as that: either the driver/OS or the VBIOS is producing this painful output. VBIOS could be the tricky one - however, if we KNOW we are running software which does not produce extra artifacts, we have eliminated every other root cause and can pressurise Nvidia et al. with these findings.
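For what it's worth, a quick first check on any Linux box is simply whether the driver exposes a dithering control at all. Here's a minimal sketch in Python, assuming an X11 session with the `xrandr` tool installed. Property names vary by driver - as far as I can tell nouveau exposes "dithering mode"/"dithering depth" and radeon exposes "dither", while the Intel driver doesn't expose any dithering property through xrandr, which is part of the problem:

```python
# Minimal sketch: list RandR output properties and flag anything dithering-related.
# Assumes an X11 session with the xrandr CLI on the PATH.
import subprocess

def find_dither_props() -> None:
    out = subprocess.run(["xrandr", "--props"],
                         capture_output=True, text=True, check=True).stdout
    current_output = None
    for line in out.splitlines():
        if line and not line.startswith((" ", "\t")):
            current_output = line.split()[0]      # e.g. "HDMI-1 connected ..."
        elif "dither" in line.lower():
            print(f"{current_output}: {line.strip()}")

if __name__ == "__main__":
    find_dither_props()
```

If a property does show up it can be switched off, e.g. `xrandr --output HDMI-1 --set "dithering mode" off` on nouveau; if nothing shows up, the dithering decision is being made inside the driver itself, which is exactly why we need someone who knows the kernel side.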
diop Somebody out there has to know the kernel inside out and will be able to explain how Linux DEs render the desktop, what type of dithering/artifacts are used, and how they can be removed. Linux is transparent, so a dithering/artifact-free Linux could be a reality tomorrow
Well, I have a lossless capture card that I purchased earlier this year out of my own pocket because it was a limited-time deal. The issue is I need a spare desktop to use with the "image comparator" software I've been working on to detect GPU artifacts, but sadly I don't have any spare desktops lying around.
I've had some discussions with others on what it would take to send me a new desktop, but it hasn't worked out due to either shipping costs or "personal lives" getting in the way.
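To give an idea of what the comparator has to do (a rough sketch of the concept only, not the actual tool): capture two frames of a completely static desktop through the lossless card and diff them. On a clean pipeline the frames should be identical, so any differing pixels point to temporal dithering/FRC being injected somewhere between the GPU and the output. The file names below are made up for the example:

```python
# Rough sketch of the frame-comparison idea: given two losslessly captured frames
# of a *static* desktop, any differing pixels indicate temporal dithering/FRC
# happening between the GPU and the capture card.
from PIL import Image
import numpy as np

def diff_frames(path_a: str, path_b: str) -> None:
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    delta = np.abs(a - b)
    changed = int(np.count_nonzero(delta.max(axis=2)))   # pixels where any channel moved
    total = delta.shape[0] * delta.shape[1]
    print(f"{changed} of {total} pixels differ ({100.0 * changed / total:.2f}%), "
          f"max per-channel delta = {int(delta.max())}")

if __name__ == "__main__":
    diff_frames("frame_0001.png", "frame_0002.png")      # hypothetical capture files
```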
kammerer Did you ever get tested for heterophoria? Neck and jaw pain is a result of heterophoria, not the other way around. See my new post about a new article. The eyes get tired, the pain transfers through the trigeminal nerve and affects everything connected to it. I bet if you get a proper optometrist exam you'll find heterophoria.
diop 10% have it, but to what degree? The degree I have is less than 1%. There's a spectrum of how much the eyes are deviated. So I believe we all here are very rare, extreme cases of heterophoria, and also only the ones who try to find solutions online in English and work with computers. The real-life numbers are larger, and time will show, I think.
martin 10% have it, but to what degree? The degree I have is less than 1%. There's a spectrum of how much the eyes are deviated. So I believe we all here are very rare, extreme cases of heterophoria, and also only the ones who try to find solutions online in English and work with computers. The real-life numbers are larger, and time will show, I think.
That would explain why only such a minority are complaining about modern tech. Maybe we could post on /r/strabismus, as there are about 1000 users subscribed there; we could very quickly determine if other people with strabismus are facing the same problems.
I'm inclined to agree that this is looking more and more like a 'lack of true binocular fusion' barrier to entry. From what I've read, in most cases of heterophoria fusion is still happening 99% of the time, but it can be broken, e.g. with the cross-cover test - so essentially this tech is doing the same thing? I have strabismus (heterotropia), which is effectively worse than heterophoria, as no fusion is made at any point.
tfouto What OS do you use?
I'm back to using Windows 7 Home Premium on my old 2010 Acer desktop. For me right now it seems any desktop with Intel integrated graphics from 2010 or earlier is good - obviously you need to use Windows 7 and older drivers - I'm using Feb 2010 drivers!
There must be some stereoscopic behaviour in modern devices - the question is - why?
I've been posting on Reddit to seek more opinions and also to promote this forum.
I got a reply from a post on /r/optometry, presumably from a working professional.
I've yet to find any compelling evidence LED is any worse for people's eyes than any other light source. Just anecdotes that don't hold up to even mild testing.
The more likely culprit is the improvement in visual fidelity with technology. Video games and movies are viewed on larger, higher-resolution screens at high refresh rates. Modern video games are capable of reproducing visual cues like lighting and fog that make them more realistic and immersive than ever. Even cartoon-style games are buttery smooth and immersive. Even modern UIs have elements of depth and realism to them. Smartphone screens use AMOLED displays which create more immersive and true colors, especially with darker tones in images.
This is great and the desired effect, but for some individuals the virtual reality of the device clashes with the reality of what's going on around you, and suddenly your brain is processing conflicting information about its place in the world. This is disturbing enough for some individuals to cause nausea.
What I will agree with, and know anecdotally, is that I do get motion sickness with certain video games (but never get travel sick). For those of you old enough to remember the original Wolfenstein 3D, that game made me go crazy after 2 minutes, and that is a 30-year-old DOS game. So it would be trivial nowadays for that type of effect to be baked into a UI to make it appear more 3D - maybe we're all getting 'motion-sickness lite' as a result of the new stereoscopic effects?
diop For those of you old enough to remember the original Wolfenstein 3D, that game made me go crazy after 2 minutes, and that is a 30-year-old DOS game. So it would be trivial nowadays for that type of effect to be baked into a UI to make it appear more 3D - maybe we're all getting 'motion-sickness lite' as a result of the new stereoscopic effects?
As I remember, I got nausea-like symptoms after playing Quake (II or III), but it seems they only came on after an hour of play.
Curious article about sickness from games: https://www.theguardian.com/lifeandstyle/2011/dec/19/video-games-makes-me-sick
kammerer Interesting article, and still relevant.
I never really got into FPS games after Goldeneye due to the ill effects I had. The last time I tried a CoD game was 2008 I believe, and I couldn't get past the training mission (less than 30 minutes). FIFA and Mario Kart are fine; it's just FPS games that I can't play.
So can anybody on this forum play FPS games such as CoD for extended periods without issues?
I would say the symptoms I get now from modern tech are either heavy eyes or something similar to a very mild form of motion sickness (suggesting there is extra motion on the screen to process, e.g. dithering).
I used to get motion sick from most FPS games including the first DOOM, but I can play the recently released Call of Duty Mobile (MP mode) all day long without any issues.
In general, the narrower the tunnels in a game and the faster I move, the more motion sick I get.
Racing games are less of a problem, I guess due to the wide area and less harsh camera movements.
I have filed a bug on Ubuntu Launchpad to try and disable dithering; it's not been assigned to anybody yet, but I will keep checking back for updates.
I've also been posting on some subreddits (/r/strabismus, r/optometry) to try and get advice from others, unfortunately there has not been a big response.
I'm not giving up, but (part rant) I'm fed up with this BS. I haven't been employed for quite some time now; why would I want to work on a computer all day if I'm 'stuck' on my good machine, which is over a decade old? My lovely new NUC is sitting here, which would be a real aid to my personal life (music production/graphic design) and also as a media player/games console. Even on good devices, it seems that app by app or update by update, things are breaking and becoming unusable fast.
I'm not trying to sound pessimistic, but we can't live on years old software and hardware forever. I understand that everybody has their own lives but how do you all work right now? Don't you get really bad symptoms?
If I went to a doctor today and said "I can't work in an office anymore because of XYZ causing me eye strain" - as it's not even a medically recognised condition, I would get no support from the state (I'm based in the UK). I could get a job away from technology (a cleaner, teacher, shop assistant), but I shouldn't have to resort to this much of a change just because of a software update.
I may be part of a minority, and I'm very grateful that we can share our experiences, but these companies have so much to answer for if updates are causing individuals like myself to have this sort of reaction.
diop The issue here is that we are a minority with severe symptoms. I know now that more people are impacted; however, their symptoms are less severe, so they don't look into it.
I am discussing with legal at the company I work for what actions we could take, and one of them proposed that if 500-1000 people can come together we could complain to the EU. There is a formal process for this; however, all my legal contacts were not optimistic that anything immediate could result given the EU's red tape, but we could give it a shot. This site could assist in gathering a significant number of people residing in the EU. It would be a good way to get noticed. If there is a person (or people) on this forum with a legal background, they could assist in forming such a complaint in a more formal way.
Regarding the option of building a legal case, all my legal contacts (UK and Greece) told me that it seems difficult due to the uniqueness of the symptoms to each person. OLED displays seem to be the worst for me; others in this forum can use them. The subject matter seems vague from a legal perspective. Then it is a matter of jurisdiction: which would be the applicable legal framework? An EU country, the UK, the USA?
Peter In the UK there is what is known as the 'Small Claims Court', in which an individual can take legal action against a company. However, for any legal proceedings to be effective, it would have to be categorically proven that the discomfort came from 'exhibit A'. Herein lies the issue: almost every tech company is now using these rendering/display techniques. So it's not just 'the people vs Apple' or 'vs Intel'; it's the offending output that's the issue.
Peter Regarding the option of building a legal case, all my legal contacts (UK and Greece) told me that it seems difficult due to the uniqueness of the symptoms to each person.
This is the problem, and it is frustrating, as everybody here must feel as if they're on their own with their unique set of symptoms. One thing we all have in common here is that none of us can find anything modern that works. Do all modern devices use dithering/pixel movement that is different to a decade ago? I think so; it's just a matter of proving that with hard evidence. What I hope to gain re: Ubuntu/Linux (if we can get that far) is a proof of concept to see what happens when running a dithering-free OS/driver for extended hours. Do our symptoms go away? Are they 50% better? We won't know for certain until everything from OS > driver > GPU > display is accounted for.
diop I'm not trying to sound pessimistic, but we can't live on years old software and hardware forever. I understand that everybody has their own lives but how do you all work right now? Don't you get really bad symptoms?
If I went to a doctor today and said "I can't work in an office anymore because of XYZ causing me eye strain" - as it's not even a medically recognised condition, I would get no support from the state (I'm based in the UK). I could get a job away from technology (a cleaner, teacher, shop assistant), but I shouldn't have to resort to this much of a change just because of a software update.
I don't have much to add to this thread and I don't have much to offer, as I am not very knowledgeable about tech like most of you are, but I just wanted to chime in on this comment. Many of you started having issues around 2012 and my issues started only in April 2019... so whatever is wrong with the tech, I was able to tolerate it until this year. It's comforting to know I am not alone, but also highly distressing knowing that there are other people in the same situation I am in who have not found solutions. I also wonder how each of you get by in life, as you mentioned. I am barely hanging on. I have more specialist/doctor appointments lined up, but I am not expecting them to be of much help. At work, my boss is trying to help me as much as he can, but there are only so many accommodations that can be made before I have to change my career. My productivity is declining and I leave work in tears sometimes because it hurts so badly. I have to take many breaks and turn my monitor off for a while. I no longer watch TV and I don't communicate with people as much. I have a master's degree and a great deal of student loan debt... it's disheartening to think I may have to completely change my career and worry about how I will make money due to this issue. I would never want to have to claim disability, but you're right - this would not even be recognized as a disability if we needed to file, and there are very few higher-paying jobs out there that don't rely on tech.
All that said, I am 100 percent supportive of any calculated action that will inform others of our condition and hopefully aid in SOME kind of help, research, or solution. If there is anything I can do, I will be glad to help.
diop how do you all work right now? Don't you get really bad symptoms?
It has been awful for the past year and a half. I go home worn out and I start the morning in pretty much the same condition. My productivity has become very low in spite of all my efforts. I have thought of changing jobs, but not understanding what causes my issues, and the sensitivity I have developed to modern lighting in addition to electronic devices, makes me hesitate. I am completely lost. I have not received any real support or inspiration from any of the medical doctors and practitioners I have visited. Recently I have been spending a fortune on massages of various types and acupuncture to release neck and face tension, but if there is any relief it is only temporary. Exposure to a bad device restores the bad symptoms within a few minutes. The worst is probably the eyelid and facial spasms.
I have not yet been to an orthoptics specialist, because here that discipline is considered a "folk remedy". I will try my luck in Europe during the Christmas break. That is my last option.
My problems date back to before 2012. I used my ThinkPad T60 on XP from 2010 to 2017, day and night, with no problems, but from 2007 to 2010 I kept it in a drawer thinking it was unusable. What made the difference was changing the display resolution from native to 1024x768. Same hardware, same software. It was not a matter of font size, because I ran that test too. At native resolution my eyes fried; at the lower resolution I could work for hours without a break.
My eyes or brain are not able to cope with external optical impulses the vast majority of people are fine with. That is my only conclusion.
With regard to making our discomfort more public, maybe one option is to support already existing campaigns, e.g., https://lightaware.org/about/individual-stories/.
That is what one of the ambassadors wrote to me months ago: "We are collecting case studies of the way that lighting affects different people to show how many people are affected and how it impacts on their daily lives. The idea is that if we can collect hundreds of similar stories then we can show medics and researchers that this is a problem which is affecting a diverse range of people, and also potentially provide case studies for researchers."
In my opinion there are many people affected. We are not just a few hundred. Shouldn't we try to have sufferers converge on one or a few platforms rather than spreading them out?
Looking at the Intel support forums (Windows), there are lots of reports of incorrect color depth being detected for displays, and users complaining they can't select 10/12-bit color. There are also a few requests to disable dithering.
Our driver supports Color Depths of 8-bit or 12-bit via HDMI*.
Currently, if a 10-bit display is used the driver will default to 8-bit with Dithering or 12-bit if supported.
Please refer to Deep Color Support of Intel Graphics White Paper. (Page 11)
There is already a request to allow users to manually select the desired Color Depth via IGCC (Intel® Graphics Command Center), but this is a work in progress with no ETA; however, it is on our top priority list.
The above doesn't impact at all the Encoding/Decoding capabilities of the graphics controller. HEVC 10-bit video encoding/decoding via Hardware is supported by the graphics controller.
There was another post I saw, and I forget the link, but the support answer was that the driver automatically enables dithering for 8-bit and above, and disables it for 6-bit. Which seems backwards, as a poster pointed out. Is there a way to spoof or rewrite an EDID so the panel is detected as 6-bit?
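In principle the EDID itself isn't hard to patch: in an EDID 1.4 base block, byte 20 carries the panel's declared bits per colour in bits 6-4 (001 = 6 bpc, 010 = 8 bpc), and byte 127 is a checksum chosen so the 128-byte block sums to 0 mod 256. Here's a rough sketch; whether a driver would actually honour an override built this way (e.g. loaded via drm.edid_firmware on Linux or an EDID override in the Windows registry) is the part I'm not sure about:

```python
# Hedged sketch: re-declare an EDID 1.4 panel as 6 bpc and fix the checksum.
def patch_edid_to_6bpc(edid: bytes) -> bytes:
    block = bytearray(edid[:128])
    if len(block) != 128:
        raise ValueError("need at least the 128-byte base EDID block")
    if block[18] != 1 or block[19] < 4:
        raise ValueError("colour-depth bits are only defined like this in EDID 1.4+")
    if not block[20] & 0x80:
        raise ValueError("analog input: no colour bit-depth field to patch")
    block[20] = (block[20] & 0x8F) | (0b001 << 4)   # bits 6-4 = 001 -> 6 bits per colour
    block[127] = (-sum(block[:127])) & 0xFF          # block must sum to 0 mod 256
    return bytes(block) + edid[128:]                  # leave any extension blocks alone

# Example (hypothetical file names):
# with open("monitor.bin", "rb") as f:
#     patched = patch_edid_to_6bpc(f.read())
# with open("monitor_6bpc.bin", "wb") as f:
#     f.write(patched)
```

Given the support answer above says dithering is disabled for 6-bit, this would at least be a way to test the theory - at the cost of the panel only ever being fed 6-bit colour.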
Intel Deep Color White Paper - https://www.intel.com/content/dam/www/public/us/en/documents/white-papers/deep-color-support-graphics-paper.pdf
Most of the traditional laptop panels were internally 6 bpc (= 18 bpp panels). This naturally means even a normal 24 bpp image can show banding when viewed on an 18 bpp panel. To avoid this, a process called "dithering" is applied, which is almost like introducing a small noise to adjoining pixels. This creates variations in the 18-bit representation and results in hiding color banding on such panels. Either the GPU's display HW or the panel itself might do this. When the panel does this, the source (the GPU's display HW) is not aware of it, and the panel will advertise itself as a normal 8 bpc (24-bit) panel.
So I gather from this that the reason dithering is always enabled is as a failsafe to avoid a poor 6-bit+FRC implementation. Which in a way makes sense, because there could be very cheap monitors out there that are advertised as 8-bit but are really 6-bit+FRC (as we know), so the dithering at the GPU side ensures a consistent 8-bit output across all monitors.
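To picture what the white paper is describing, here's a toy illustration of my own (not from the paper): quantise a smooth 8-bit ramp straight down to 6 bits and it breaks into flat bands roughly 30 pixels wide; add a little noise before quantising - a crude stand-in for dithering/FRC - and the bands dissolve back into fine pixel-level variation. When that noise pattern changes every frame, that's the extra "motion" some of us suspect we're reacting to.

```python
# Toy illustration: 8-bit -> 6-bit truncation causes banding; adding small noise
# before quantising (crude dithering) trades the bands for pixel-level noise.
import numpy as np

rng = np.random.default_rng(0)
step = 4                                                    # 8 bpc -> 6 bpc step size
gradient = np.linspace(0, 255, 1920).round().astype(int)    # smooth 8-bit ramp

banded = (gradient // step) * step                          # plain truncation: flat bands
noise = rng.integers(0, step, size=gradient.shape)          # rectangular dither noise
dithered = np.clip(((gradient + noise) // step) * step, 0, 255)

def average_run_length(values) -> float:
    """Average length of a run of identical consecutive pixels (a proxy for band width)."""
    changes = int(np.count_nonzero(np.diff(values)))
    return len(values) / (changes + 1)

for name, img in [("original 8-bit", gradient),
                  ("truncated 6-bit", banded),
                  ("dithered 6-bit", dithered)]:
    print(f"{name:>16}: average flat run = {average_run_length(img):.1f} px")
```

The average flat-run length should come out at roughly 7-8 px for the original ramp, around 30 px after plain truncation (the visible bands), and only a few px once the noise is added.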
Windows* OS doesn't support more than 8 bpc for the desktop. This means even if an application has more than 8 bpc content, it will all be compressed to 8 bpc during the desktop window composition process (see the figure in the white paper).
So perhaps all these applications that are starting to cause strain are designed with HDR in mind, but they are being dithered down to 8 bpc?
In many respects I can understand why dithering is enabled. It's much easier to just force all displays to 8-bit rather than risk an incorrectly detected monitor/color combination. However, I think advanced controls should still be available to the consumer, as dithering on a true 8-bit monitor produces extra noise when it isn't needed - so I'm not getting what I pay for as a consumer. Eye strain or not, dithering isn't needed if the monitor is detected correctly and the correct color range is selected.