@f3likx now that some time has passed (2 months), could you please give us an update on how your eyes are doing? It would be interesting to know if you still feel as good about it as you did before. Thanks

So, I typically run the monitor at 144 or 165 Hz in "reading mode" (which adds an anti-blue filter), unless I'm playing a fast-paced shooter or watching a movie with nice colors, in which case I switch to the regular color mode. I do use low brightness settings since my room is often somewhat darkened (I have Hue lights overhead), but that doesn't impact the image quality on this display the way it does on the TN panels I've owned. I typically turn on the built-in backwards-facing ambient light at night.

After 2 months, my experience hasn't changed: I can still use it for many hours a day, even on top of the 8 I spend at work looking at a couple of 1080p 60 Hz IPS panels. It causes the least eyestrain of any display device I own or come across at work, with the exception of an e-ink display.

For me, it is on the same eyestrain level as a CCFL-backlit display.

    f3likx

    Can you tell me your setup outside of the monitor? i.e. GPU, driver version, and OS? (Also, what previous GPUs have worked for you?)

    Thanks!

      Soreeyes

      GPU is a 1080 Ti, OS is Windows 10 1803, driver 382.53. I also have a Windows 7 dual-boot, which didn't have any impact on eye-strain for me (I would use Windows 7 for lower input lag and better MIDI support in classic Doom).

      I haven't had a GPU "work" for me since before the 780 Ti; I was running that card with both the Asus VG278H and later the PG278Q (which is when the eyestrain issues became a hugely noticeable problem for me). I went from the 780 Ti to a 980, then a 980 Ti, then the 1080 Ti with the PG278Q, then the PG278QR, and now this monitor.

      • hpst replied to this.

        Is this the first report here of a 10xx series Nvidia GPU working fine for someone?

        @f3likx Was the 1080Ti OK on your previous screen(s) or not?

        I'm curious because I've had no joy on anything newer than a late-model 970 (including a 980, 980 Ti, 1070, and 1080) with my known-good Dell 2407 CCFL monitor.

          AgentX20 I have a self-built 1050 Ti setup with an Eizo FlexScan EV2336W, and it's good for all-day use.
          Just for info, my eye-profile background is sensitivity to LED backlight PWM, and I can't use MacBooks made after 2013. So Eizo and Windows (with its auto-updates) have been good for me so far.

          a month later

          f3likx You still around? Can you update us on the display and whether everything is still good? Have you tried it with different hardware?

          Everything is still good, but I do use the "reading" mode on the panel now, and I continue to wear my anti-blue glasses as well. The main reason is that the brightness is too high in all the other modes; I can still see all the details in dark scenes at 0 percent brightness. Using the monitor without anti-blue glasses is doable, but I save that for days when I haven't already been staring at screens for 8 hours at work.

          As for different hardware, I have some Dell P2312H IPS panels at work which annoy me (but I use anti-blue glasses there), and I still use an ASUS VG278H 120 Hz TN panel at home which annoys me significantly more than the 32GK850G (which now has a cheaper FreeSync version, if I'm correct). I have a Pixel 2 which causes eyestrain, but with the "night mode" enabled, it seems to have almost zero eyestrain. I also have an Nvidia Shield tablet, which causes zero eyestrain in any mode. It is a bit bright at minimum, but that's separate from my main issue.

          I haven't gone over to the 2080 Ti due to the skyrocketing failure rates of the Micron-equipped, standard-PCB versions... which is to say, most of them. When I do, I'll update this, but I really don't expect to experience any difference in eye strain.

          It took a while to get used to seeing the whites of my eyes in the mirror (instead of yellow or red!)

            f3likx I have a Pixel 2 which causes eyestrain, but with the "night mode" enabled, it seems to have almost zero eyestrain.

            This caught my attention. Does anyone know why this would be? Those utilities aren't really reducing the blue-light frequencies reaching your eyes, just tinting things, so even if you were hypersensitive to blue light it shouldn't make such a difference. I wonder if "night mode" changes some other aspect, like dithering or something else?

            Interesting that the Shield is fine... what Android version is it running?

            It IS very frustrating how so many displays are so bright, some even at the lowest possible setting, and how people on review sites seem to laud massive, bright displays. They act like anything under 300 nits is simply unusable, but I never go above the lowest two steps on any backlight.

            • KM replied to this.

              KM I have found that the newer iPhones that are OLED/AMOLED burn my eyes, making them bloodshot and then watery. I got the new iPhone X and that was it. Galaxy phones and Pixel phones do the same.

              The older iPhones, which are not OLED but have IPS panels, are fine.

              Then on the computer side: no pain on any PC except the 4K machines. As for MacBooks, all the new MacBooks get my eyes bloodshot. I have no clue why.

              • KM replied to this.

                f3likx I haven't gone over to the 2080 Ti due to the skyrocketing failure rates of the Micron-equipped, standard-PCB versions... which is to say, most of them. When I do, I'll update this, but I really don't expect to experience any difference in eye strain.

                Keep us posted. Still holding out some hope that the 20 series might be different.

                (Much in the same way I'd like to try out an AMD Vega 56 or 64...)

                  AgentX20 If you are not a gamer, I would rather go for an Nvidia Quadro or an AMD FirePro graphics card. These workstation cards seem to be easier on the eyes (the Quadro does not seem to have dithering enabled, and the AMD card can turn off dithering with the "enable 10-bit color" switch).

                    deepflame Enabling 10-bit on the AMD card turns off dithering? Do you need a true 10-bit monitor?

                      Cougarfalcon83 This sounds familiar. I get burning, red eyes from my AMOLED phone when it's in PWM mode, meaning it is flickering at 240 Hz. Above a certain brightness value, PWM turns off. Maybe this is possible on iPhone X devices, too. Most Samsung AMOLEDs are known to flicker even at full brightness.
                      But that's only half of the story. The software (apps) and the OS (ROM) can contribute to eye strain, too, as described in detail in the "Usable Smartphones" thread.

                      hpst This caught my attention. Does anyone know why this would be? Those utilities aren't really reducing the blue-light frequencies reaching your eyes, just tinting things, so even if you were hypersensitive to blue light it shouldn't make such a difference.

                      I have an idea about this: the Pixel 2 is AMOLED. That technology has no backlight, just many small red, green, and blue LEDs as subpixels. So when you turn down blue, the blue LEDs go down in brightness, resulting in less blue light.
                      There are apps that can display a single color fullscreen (I use "Bad Pixels" from F-Droid for this). When I set it to pure blue and then put on my $10 blue-light-blocking glasses, the display, although emitting a bright blue, appears as if it were turned off.
                      However, judging by what I have read over time, even backlit devices should reduce blue light successfully by, as you said, tinting the colors with their TFT matrix. The small difference is that a minor amount of white backlight, including its blue component, still leaves the panel between the subpixels; this is why black screens still glow a little.
                      To reveal the true spectrum, you can look at the light with a cheap spectrometer. It will tell you how the colors are distributed and roughly how much blue light there really is. When you use it to look at a white AMOLED screen, you only see three stripes: blue, green, and red. The rest of the spectrum is completely missing.

                      @f3likx must be lucky that his hardware/software setup apparently doesn't introduce eye strain when tinting the screen. Night mode on my device makes it unusable for me.

                      • hpst replied to this.

                        Link Well, it turns off dithering for applications that want to work at 10-bit color depth.

                        In other words: currently I run Windows 7 with a nine-year-old ATI FireGL card, which does not have dithering enabled for most apps. But when I use apps that request 10-bit color depth, like Capture One Pro (a photography app) or Opera, the card starts to dither heavily in order to display the 10-bit color that the 8-bit display cannot show natively.

                        Enabling 10-bit color support tells the driver that I use a 10-bit display and turns off dithering. Interestingly this also works without a 10-bit screen.

                        So I am currently very happy that I can now use some apps that I had not been able to use because of said dithering in 10-bit mode.

                        I assume this is the same effect that others experience playing video games. Some games request a higher color depth, and the video card resorts to dithering.
                        Now that the industry's next big thing is HDR and high contrast with flashy, vivid (and, may I say, mostly unnatural) colors, I think it makes sense that manufacturers use methods like dithering to stay competitive.

                        I hope that the industry either catches up with affordable 10-bit displays that do not dither, or that our voice gets heard more and devices like the e-ink laptop Sony is supposedly working on get bigger R&D budgets and are released sooner.

                        Sometimes I cannot believe that we are such a minority that cannot use new tech. I would assume that more people are affected already and just do not yet realize what is causing their stress, headaches, eye strain, or bad sleep.

                        Also, does no employee at Apple, Microsoft, Sony, etc. have these issues with their own tech? Or are they so blinded by the next brighter, more colorful display, meant to wow the world and stay ahead of the competition, that topics like ergonomics and health get forgotten?

                          deepflame Enabling 10-bit color support tells the driver that I use a 10-bit display and turns off dithering. Interestingly this also works without a 10-bit screen.

                          So if tricking the app into thinking you have a 10-bit display keeps it from dithering within the app, I wonder if we could do this system-wide? @JTL, is this something you could use, or are already using? Is there a way to trick the entire system/GPU/whatever makes that request into believing we have a 10-bit or greater display, thus avoiding the need to turn on dithering for our actual 6- or 8-bit panels? Claiming a "quality" greater than anything that gets requested would be even better, because you could declare something wild like "this is a 128-bit display" and it would cover us long-term, rather than chasing 10 bits and then the next thing, on and on.
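                          On Linux, at least, the closest existing knob I can think of is asking X for a 30-bit (10 bits per channel) framebuffer, so that's the depth apps would see. A rough, untested sketch follows; the file path and Identifier are placeholders, and whether a given driver and panel combination accepts depth 30 is a separate question:

                              # /etc/X11/xorg.conf.d/30-bit-depth.conf (hypothetical path; adjust to your setup)
                              Section "Screen"
                                  Identifier "Screen0"   # placeholder; match your existing Screen section
                                  DefaultDepth 30        # 30 = 10 bits per channel, vs. the usual 24
                              EndSection

                          After restarting X, xdpyinfo | grep "depth of root" should show whether it took. Whether that actually stops driver-side dithering on a 6- or 8-bit panel is exactly the open question, though.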

                          KM I have an idea about this: the Pixel 2 is AMOLED.

                          I think I follow what you have written, but I'm missing how that explains the night-mode tinting causing strain on an AMOLED.

                          deepflame Someone on the Nvidia forums claimed that Quadro cards of the same chipset generation as the 10xx series (Pascal) have dithering forcibly enabled.

                          • Link replied to this.

                            deepflame The thing is, in the past screens were 6-bit and dithered to get up to 8-bit. Dithering isn't a new technique, so I'm really not sure whether we are looking in the wrong place. Also, dithering is such a faint flicker, especially compared to, say, harsh LED PWM flicker, so things don't add up when someone is sensitive to dithering but completely fine with PWM.

                            Also, very few games up to this point are 10-bit. Maybe a handful. And if content like a video game or movie is encoded in 10-bit, I'm not sure it can request that a panel or GPU dither.

                              JTL There are people on forums complaining about 10xx-series Nvidia cards not having dithering enabled, and about banding problems because of it. So there is a lot of mixed info out there, and I'm not sure how to test.

                              If running Linux, you can turn off dithering on Nvidia 10xx-series cards, right?
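                              From what I've read, the proprietary driver exposes this per display through nvidia-settings. Something like the following should query and force it; untested by me on a 10xx card, and "DP-0" is a guessed display name:

                                  # List the display names the driver knows about (DP-0, DVI-I-1, ...)
                                  nvidia-settings -q dpys

                                  # Check the current dithering state on one display ("DP-0" is a guess)
                                  nvidia-settings --query "[DPY:DP-0]/Dithering"

                                  # Force dithering off (0 = auto, 1 = enabled, 2 = disabled)
                                  nvidia-settings --assign "[DPY:DP-0]/Dithering=2"

                              I believe the assignment doesn't persist across X restarts, so it would have to go in an autostart script. Whether the hardware honors it on every output seems to be part of the mixed info.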
