f3likx You still around? Can you update us on the display and if everything is still good? Have you tried it with different hardware?

Everything is still good, but I do use the "reading" mode on the panel now, and I continue to wear my anti-blue glasses as well... the main reason is that the brightness is too high in all the other modes. I can still see all details in dark scenes at 0 percent brightness. Using the monitor without anti-blue glasses is doable, but I save that for days when I haven't already been staring at screens for 8 hours at work.

As for different hardware, I have some Dell P2312H IPS panels at work which annoy me (but I use anti-blue glasses there), and I still use an ASUS VG278H 120 Hz TN panel at home which annoys me significantly more than the 32GK850G (which now has a cheaper FreeSync version, if I'm correct). I have a Pixel 2 which causes eyestrain, but with the "night mode" enabled, it seems to have almost zero eyestrain. I also have an Nvidia Shield tablet, which causes zero eyestrain in any mode. It is a bit bright at minimum, but that's separate from my main issue.

I haven't gone over to the 2080 Ti due to the skyrocketing failure rates of any Micron-equipped, standard-PCB versions... which is to say, most of them. When I do, I'll update this - but I really don't expect to experience any difference in eye strain.

It took a while to get used to seeing the whites of my eyes in the mirror (instead of yellow or red!)

    f3likx I have a Pixel 2 which causes eyestrain, but with the "night mode" enabled, it seems to have almost zero eyestrain.

    This caught my attention. Does anyone know why this would be? Those utilities aren't really reducing the blue-light frequencies reaching your eyes, just tinting things, so even if you were hypersensitive to blue light it shouldn't make such a difference. I wonder if "night mode" changes some other aspect, like dithering or something else?

    Interesting the Shield is fine...what Android version is it running?

    It IS very frustrating how so many displays are so bright, some even at the lowest possible setting, and how people on review sites seem to laud massive, bright displays. They act like anything under 300 nits is simply unusable, but I never go above the lowest two steps on any backlight.

    • KM replied to this.

      KM I have found that newer iPhones that are OLED/AMOLED burn my eyes, get them bloodshot, and then make them water. I got the new iPhone X and that was it. Even looking at Galaxy phones and Pixel phones does the same.

      The older iPhones - which are not OLED but have IPS panels - are fine.

      Then on the computer side - no pain on any PC except the 4K machines. MacBook - all new MacBooks get my eyes bloodshot. I have no clue why.

      • KM replied to this.

        f3likx I haven't gone over to the 2080 Ti due to the skyrocketing failure rates of any Micron-equipped, standard-PCB versions... which is to say, most of them. When I do, I'll update this - but I really don't expect to experience any difference in eye strain.

        Keep us posted. Still holding out some hope that the 20 series might be different.

        (Much in the same way I'd like to try out an AMD Vega 56 or 64...)

          AgentX20 If you are not a gamer I would rather go for an Nvidia Quadro or an AMD FirePro graphics card. These workstation cards seem to be more comfy on the eyes (the Quadro does not seem to have dithering enabled, and the AMD card can turn off dithering with the "enable 10-bit color" switch).

            deepflame Enabling 10-bit on the AMD turns off dithering? Do you need a true 10-bit monitor?

              Cougarfalcon83 This sounds familiar. I get burning, red eyes from my AMOLED phone when it's in PWM mode, meaning it is flickering at 240 Hz. Above a certain brightness value, PWM turns off. Maybe this is possible on iPhone X devices, too. Most Samsung AMOLEDs are known to flicker even at full brightness.
              But that's only half of the story. The software (apps) and OS (ROM) can contribute to eye strain too, as described in detail in the "Usable Smartphones" thread.
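              For intuition, PWM dimming lights the panel fully for only a fraction (the duty cycle) of each cycle and leaves it dark for the rest; at lower brightness settings the dark gap grows, which is when the flicker tends to be most noticeable. A rough sketch using the 240 Hz figure from above (the function name is mine, not from any datasheet):

```python
# Rough sketch of PWM dimming: the panel is fully lit for a fraction
# (the duty cycle) of each cycle and dark for the remainder. Perceived
# brightness tracks the duty cycle; the flicker rate stays fixed.

PWM_FREQ_HZ = 240  # flicker rate mentioned in the post above

def pwm_on_time_ms(duty_cycle: float) -> float:
    """Milliseconds the panel is lit per cycle for a duty cycle in [0, 1]."""
    period_ms = 1000 / PWM_FREQ_HZ
    return period_ms * duty_cycle

print(round(pwm_on_time_ms(0.5), 2))  # 50% brightness: ~2.08 ms lit per ~4.17 ms cycle
print(round(pwm_on_time_ms(0.1), 2))  # 10% brightness: ~0.42 ms lit, long dark gap
```

This also illustrates why some panels switch to DC dimming above a brightness threshold: at high duty cycles the dark gap shrinks toward zero anyway.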

              hpst This caught my attention. Does anyone know why this would be? Those utilities aren't reducing bluelight frequencies getting to your eyes really, just tinting things, so even if you were hypersensitive to blue light it shouldn't make such a difference.

              I have an idea about this: Pixel 2 is AMOLED. That technology has no backlight but many small red, blue, and green LEDs as subpixels. So when you turn down blue, the blue LEDs go down in brightness, resulting in less blue light.
              There are apps that can display only one color in fullscreen (I use "Bad Pixels" from F-Droid for this). When I set it to pure blue and then put on my $10 blue-light-blocking glasses, the display, although emitting a bright blue, appears as if it were turned off.
              However, judging by what I have read over time, even backlit devices should reduce blue light successfully by, as you said, tinting the colors with their TFT matrix. The small difference is that a minor amount of white backlight, including blue, still leaves the panel between the subpixels - this is why black screens still glow a little.
              To reveal the true spectrum, you can look at the light with a cheap spectrometer. It will tell you how the colors are distributed and roughly how much blue light there really is. When you use it to look at a white AMOLED screen, you only see three stripes of blue, green, and red. The rest of the spectrum is completely missing.

              @f3likx must be lucky that his hardware/software setup apparently doesn't introduce eye strain when tinting the screen. I can't use night mode on my device; it makes it unusable for me.
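              The AMOLED point above can be sketched numerically: on an emissive panel a night-mode tint is just a per-channel scale of the frame, and because each subpixel is its own LED, dimming the blue channel in software directly reduces the emitted blue light. A minimal sketch (the 0.4 factor is an arbitrary example, not the Pixel's actual value):

```python
# Sketch of night-mode tinting on an emissive (AMOLED) panel: the tint
# is a per-channel scale of the framebuffer, and since every subpixel
# is its own LED, scaling the blue channel down drives the blue
# subpixels proportionally dimmer. The 0.4 factor is an arbitrary
# example value, not taken from any real device.

def night_mode(rgb: tuple[int, int, int], blue_scale: float = 0.4) -> tuple[int, int, int]:
    """Apply a crude night-mode tint by attenuating the blue channel."""
    r, g, b = rgb
    return (r, g, round(b * blue_scale))

white = (255, 255, 255)
print(night_mode(white))  # (255, 255, 102): blue subpixels driven at ~40%
```

On a backlit LCD the same math applies to the pixel values, but, as noted above, some broadband backlight still leaks between the subpixels.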

              • hpst replied to this.

                Link Well, it turns off dithering for applications that want to work in 10-bit color depth.

                In other words: currently I run Windows 7 with a 9-year-old ATI FireGL card that does not have dithering enabled for most apps. When I use apps that request 10-bit color depth, like Capture One Pro (a photography app) or Opera, the card starts to dither heavily in order to display the 10-bit color depth that the 8-bit display cannot show.

                Enabling 10-bit color support tells the driver that I use a 10-bit display and turns off dithering. Interestingly this also works without a 10-bit screen.
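                The dithering described here is usually temporal (FRC): the driver or panel approximates a 10-bit level on an 8-bit panel by alternating between the two nearest 8-bit levels across frames so the time-average matches. A minimal sketch of the idea (illustrative only, not any real driver's algorithm):

```python
# Illustrative sketch of temporal dithering (FRC): a 10-bit level is
# approximated on an 8-bit panel by alternating between the two nearest
# 8-bit levels so that the time-average matches. Not any real driver's
# algorithm; edge cases (e.g. the very top level) are ignored.

def frc_frames(value_10bit: int, n_frames: int = 8) -> list[int]:
    """8-bit values shown over n_frames to approximate value_10bit."""
    base, frac = divmod(value_10bit, 4)  # one 8-bit step spans four 10-bit steps
    pattern = [base + 1] * frac + [base] * (4 - frac)
    return (pattern * ((n_frames + 3) // 4))[:n_frames]

frames = frc_frames(514)              # a level between 8-bit 128 and 129
print(frames)                         # [129, 129, 128, 128, 129, 129, 128, 128]
print(sum(frames) / len(frames) * 4)  # 514.0 - the average matches the target
```

Every pixel whose value falls between two native levels flips like this at the refresh rate, which is why a "static" image is never quite static on a dithering panel.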

                So I am currently very happy that I can now use some of the apps that I had not been able to use because of said dithering in 10-bit mode.

                I assume this is the same effect that others experience playing video games. Some games request a higher color depth and the video card resorts to dithering.
                Now that the industry's next big thing is HDR and high contrast with flashy, vivid (and, may I say, mostly unnatural) colors, I think it makes sense that manufacturers use methods like dithering to stay competitive.

                I hope that the industry either catches up with affordable 10-bit displays that do not dither, or that our voice gets heard more and devices like the E-ink laptop that Sony is supposedly working on get bigger R&D budgets and get released sooner.

                Sometimes I cannot believe that we are such a minority that cannot use new tech. I would assume that more people are affected already and just do not realize yet what the cause of their stress, headaches, eye strain, or bad sleep is.

                Also, does no employee at Apple, Microsoft, Sony, etc. have these issues with their own tech? Or are they so blinded by the next brighter, more colorful display - made to wow the world and stay ahead of the competition - that topics like ergonomics and health get forgotten?

                  deepflame Enabling 10-bit color support tells the driver that I use a 10-bit display and turns off dithering. Interestingly this also works without a 10-bit screen.

                  So if tricking the app into thinking you have a 10-bit display keeps it from dithering within the app, I wonder if we could do this system-wide? @JTL is this something you could use, or already are? Is there a way to trick the entire system/GPU/whatever makes that request into believing we have a 10-bit or greater display, thus avoiding the need to turn on dithering for our actual 6- or 8-bit panels? Claiming a "quality" greater than what is requested would be even better, if it worked: you could tell it something wild like "this is a 128-bit display", and that would cover us long term rather than chasing 10 bits, then the next thing, on and on.

                  KM I have an idea about this: Pixel 2 is AMOLED.

                  I think I follow what you have written, but I am missing how that explains the night-mode tinting causing strain on the AMOLED.

                  deepflame Someone on the Nvidia forums claimed that the Quadro cards of the same chipset generation as the 10xx (Pascal) have dithering forcibly enabled.

                  • Link replied to this.

                    deepflame The thing is, in the past screens were 6-bit and dithered to get up to 8-bit. Dithering isn't a new technique, so I'm really not sure if we are looking in the wrong place. Also, dithering is such a faint flicker, especially compared to, say, harsh LED PWM flicker, so things don't add up when someone is sensitive to dither but completely fine with PWM.

                    Also, very few games up to this point are 10-bit. Maybe a handful. And if content like a video game or movie is encoded in 10-bit, I'm not sure it can request a panel or GPU to dither.

                      JTL There are people on forums complaining about 10xx-series Nvidia cards not having dithering enabled and having banding problems because of it. So there's a lot of mixed info out there; I'm not sure how to test.

                      If running Linux, you can turn off dithering on Nvidia 10xx-series cards, right?
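                      For reference, the proprietary Nvidia driver on Linux does expose dithering controls through nvidia-settings; a hedged sketch of what that looks like is below. The exact attributes and values vary by driver version, so verify against your own setup before relying on it:

```shell
# List the dithering-related attributes your driver actually exposes
# (names and supported values differ between driver versions).
nvidia-settings -q all | grep -i dither

# Query the main dithering attributes directly.
nvidia-settings -q Dithering -q DitheringMode -q DitheringDepth

# Attempt to force dithering off (2 = Disabled for the Dithering
# attribute, as I recall; confirm with the query above first).
nvidia-settings -a "Dithering=2"
```

Whether this fully disables the temporal dithering people suspect, rather than just the output-stage dithering, is exactly what would need testing.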

                      Link Also, dithering is such a faint flicker, especially compared to, say, harsh LED PWM flicker, so things don't add up when someone is sensitive to dither but completely fine with PWM.

                      I don't agree with this, because PWM is the entire display's backlight turning on and off regularly, while dithering is each pixel flickering on its own, which creates a lot more overall movement, as you can see from the Dasung videos @degen took. Even a static screen is "moving", wiggling, always in motion. I would say seeing every background, object, letter, etc. moving constantly and independently via dithering would be harder on focus, and more straining, than the strobing of a light source. Even incandescent bulbs flicker and nobody is bothered by that. Now, I am not saying PWM or strobing lights don't bother anyone - I hate visibly strobing lights - but I do disagree that PWM is obviously much worse than wiggly dithering.

                      I also don't think the 6/8/10-bit progression disqualifies this theory, since we don't know if something similar to @deepflame's experience was happening before, where apps/OSes were requesting a certain performance and panels were simply not recognizing it or not capable of it. That's my theory as to why ancient CCFL laptops don't tend to bother us: they simply aren't capable of responding to requests for dithering up to fancier graphics. Of course, it could also be that the older software people tend to run on those older devices isn't "asking" either.

                      This will be easy to test once someone can confirm a way to switch dithering on and off. It's either going to be a clear relief or not. For me, removing PWM was not a clear relief, and while I wanted to believe it and tried to push through, PWM-free panels hurt just as much.

                      • Link replied to this.

                        hpst But the overall brightness difference is very small. The pixels are shifting between two very close colors.

                        • hpst replied to this.

                          Link In my mind it's not about brightness with respect to dithering; it's about movement. Watch @degen's videos. You can see movement everywhere, because the E-ink panel makes it visible, unlike an LCD. If you were reading a book where the white background and black letters were always moving a little bit, changing shades, etc., it would be tiring and hard to focus on. I think this is what dithering is like for us, but on a much larger scale, since even what looks like a solid white background is really a bunch of wiggly points rather than a solid area. Compound that with fonts, images, video, etc., and it's constant movement, which is exhausting for the eyes. All that wiggling, movement, and changing of colors is not obvious to the naked eye but strains the visual system nonetheless. It's like "seeing" a solid white paper and, when zooming in, realizing it's thousands of maggots crawling all over each other. We may not recognize that movement when zoomed out, but the eye muscles and brain do, and they are filtering it. We are already filtering out so much when looking at LCDs on a macro scale that this might just be the straw that breaks the camel's back.

                          Obviously this is entirely speculation; the dithering IS there, but it might not bother us at all... or it might be the root cause, and all the other stuff like PWM, eye accommodation, etc. are just additive factors or triggered by it. People were positive PWM was the root cause as well. But we need to rule this out if nothing else, and no other obvious theory at present has a definitive way to be tested.

                          Link Currently I work on this roughly 9-year-old CCFL-backlit laptop running Windows 7, and I am pretty sure dithering is a big thing for me. My eyes can tell after a few seconds (or even right away) whether it is comfy or not.
                          I do not have proof like color-banding screenshots, but I can surely tell that setting the graphics driver to "enable 10-bit color" made me able to work with Opera and Capture One Pro again.
                          Otherwise I get bad eye strain after a few minutes, like most of us do.

                          Running the current version of Manjaro Linux gives me strain again, so I am pretty sure it is software-based in my case. That tells me it should be possible to turn it off in Linux as well. The good thing here is that it is open source, so it is possible to modify the source...

                          PWM is its own beast; I found a laptop with CCFL and PWM bad as well. I also own a 10.1'' PixelQi panel that uses PWM, and it is not as good as I hoped it would be.

                          Also, @Gurm stated in a different thread that Destiny 1 is fine on his Xbox One whereas Destiny 2 is not
                          ( https://ledstrain.org/d/358-xbox-one/11 ). I am no expert here, but he also thought the graphics card changes its graphics mode and enables dithering. I would suspect it is the color depth...

                          • hpst replied to this.

                            deepflame I am no expert here, but he also thought the graphics card changes its graphics mode and enables dithering. I would suspect it is the color depth...

                            From another non-expert's point of view, it DOES seem a logical theory. And since we are never going to get every manufacturer and software maker to agree to turn dithering off or give us a switch... tricking things into believing their needs can be met without dithering seems like the only sweeping approach possible. You have a reproducible relief within those apps; now we need to figure out how to get the OS to react like the apps and make this a system-wide trick. I don't know if this is a plausible idea or not - tricking things into believing they don't need dithering, versus somehow blocking it.

                              hpst Yeah, I think it makes sense, as every manufacturer is all about HDR, high contrast, sharpness, bla bla... And I think the hardware cannot completely deliver this yet, so they need to help with software voodoo like dithering.
                              (Yes, blue light and PWM are their own things, but I think we are in the HDR era now, where dithering/FRC plays a big role.)

                              • JTL replied to this.
                              • hpst likes this.

                                deepflame I was hoping all these modern HDR monitors would finally knock some sense into these companies to produce better panels, but nothing's that simple... 🙁

                                • hpst replied to this.