I hope I'm not repeating information others have already posted, but are any of you aware of the concept of the Flicker Fusion Threshold? Whilst I believe that flicker at some level (whether FRC, dither, PWM etc.) is behind my eye strain, I was interested to read about the flicker fusion threshold. In particular, there's a really interesting piece of research here that argues that in the old days, when a sequence of simple images was enough to create the illusion of movement in the brain, the display refresh rate could be relatively low; now, with all the different things going on in modern display technology, it needs to be that much higher for flicker not to be visible.

Anyway, my reading of this and related information around the web is that possibly we ledstrain sufferers have a higher flicker fusion threshold than other people. In that context, the remedies might include higher refresh rates (has anyone tried a 240 Hz setup?) instead of, or as well as, setups that avoid techniques like FRC. I wonder too if there's any medical 'fix' or correction for the eyes of people with a higher flicker fusion threshold, but I can't find anything on this.

Anyway, I hope this isn't old ground or totally irrelevant, but I found it interesting so I thought I'd share it.

I believe this all comes down to flickering/flashing.

Back in the days of CRTs, I was able to get by using 75 Hz monitors. Then later 120 Hz CRTs came out.

With LCDs I had issues until the 120 Hz gaming versions came out, which at the time were very expensive. Since then I have mostly used gaming laptops with 120 Hz screens and, until recently, could always find one that worked.

But now it seems that even 120 Hz no longer works for me. So, as you said, I may try 240 Hz and see if that works.

But overall, yes, I do believe it is a flicker issue in all cases. Another piece of evidence that supports this (at least to me) is that I can use the same Xbox console with the same TV and have no issue with one game, yet another game will give instant eye strain, even at just the menu screen. Obviously, the way those games are being refreshed is the issue, as the hardware is exactly the same.

Also, I will sometimes have a computer setup that works for me, but certain streaming videos will cause instant eye strain even though other activities are fine on that same setup. Streaming videos play tricks with the framerate, so somehow a certain framerate must "gang up" with whatever setup I have to reach some irritation-causing frequency of flicker.

So basically, every element of your setup, from the OS to the monitor to the graphics card to the content on screen, contributes some amount of flicker, and it all adds up. The sum total of this flicker detected by your eyes falls either into the non-irritation zone or the irritation zone. Changing one thing can sometimes be enough to change the total flicker and remove the irritation. The flip side is that one small change to any one of those, such as a driver update, can push it the other way and cause a good setup to suddenly have issues.
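As a toy illustration of that "sum total" idea, you could model each part of a setup as a flicker source and flag anything that falls below a personal fusion threshold, whether on its own or beating against another source. Every name and number below is a made-up assumption for the sketch, not a measurement:

```swift
import Foundation

// Toy model of the "flicker adds up" idea: every part of a setup is a
// flicker source, and pairs of sources can interact at the difference
// (beat) of their frequencies. All values are illustrative assumptions.

// Hypothetical flicker sources in a setup, in Hz.
let sources: [String: Double] = [
    "panel refresh": 60.0,
    "backlight PWM": 240.0,
    "video content": 23.976,   // a typical film framerate
]

// An assumed personal flicker fusion threshold: flicker at or above
// this frequency is fused and unnoticed; below it, it may irritate.
let personalFusionThreshold = 90.0

// Flag sources that are individually below the threshold...
for (name, hz) in sources where hz < personalFusionThreshold {
    print("\(name) at \(hz) Hz is below the threshold on its own")
}

// ...and pairs whose beat frequency |f1 - f2| falls below it.
let names = Array(sources.keys)
for i in 0..<names.count {
    for j in (i + 1)..<names.count {
        let beat = abs(sources[names[i]]! - sources[names[j]]!)
        if beat > 0 && beat < personalFusionThreshold {
            print("\(names[i]) vs \(names[j]): beat at \(beat) Hz may be noticeable")
        }
    }
}
```

With these made-up numbers, the 23.976 fps video and the 60 Hz panel would each be flagged on their own, and their roughly 36 Hz beat against each other would be flagged as well, which matches the idea that one small change to any single component can tip the total over the line.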

That's why I've decided to just go after screen refresh rates as a solution, as that's the easiest factor for me to control and seems to have the most effect.

    screenjunky I agree it's down to flickering/flashing, or rather the way it is flickering/flashing.

    As the article in the first post states (really interesting read OP), all screens in some way flicker/refresh. CRTs AFAIK drew the line from top to bottom of the complete image and then redrew the next image (granted there is a difference between interlaced and progressive scan). I can't remember perceiving any flicker as a kid with any CRT, or strobe effects, or dizziness at all. Many of us will remember staring into white noise before switching on a games console 🙂.
    Of course all video output was analog in those days.

    The article alludes to the fact that modern display tech doesn't display the image 'naturally' like CRTs/older tech; rather, 'zones'/'coded areas' are used to suggest a full image. So we're not seeing a complete image at once as with older technology; our brains are expected to complete the picture and perceive a full image.

    It also describes a scenario in which this technology (dithering?) could be used to display certain frames to a normal viewer and completely different ones to a viewer wearing special glasses. Perhaps in the future AR-like tech could be applied to computer screens: flipping a switch on your 'glasses' would change which frames of the display output you pick up on.

    This would mean that with new tech we are essentially solving a Magic Eye puzzle, demanding our brains complete the image, as opposed to simply receiving a natural image at a given refresh rate.

    Hopefully a true dither-free fix will be with us soon.

    15 days later

    screenjunky You mentioned that 120 Hz monitors previously worked for you. What about when the strain comes from the hardware driving them? I have been using the same ASUS LCD monitors for 6 or 7 years, and they work fine with my older Windows 7 laptop but cause eye strain when used with my Windows 10 desktop. The main source of my eye strain has always been the hardware, not the monitors.

    7 months later

    My new iPad Pro 12.9 is really comfortable to use for gaming, where the screen runs at a 120 Hz refresh rate, and not so nice (i.e. causes discomfort) for apps where the refresh rate is set much lower, such as word processing or watching video. To me, this supports the theories and discussions above and indicates that, for me, higher refresh rates offer the best hope for addressing the symptoms I experience. Higher refresh rates seem to be the way display tech is going anyway, but the challenge is being able to set a minimum rate, as manufacturers tend to use variable rates to extend battery life.

    I feel that if there were a way to get the 'ProMotion' variable refresh rate control on the iPad to display everything at 120 Hz, I could use the device happily for hours. As it stands, I can use it for a couple of hours each day, probably because the refresh rate reaches 120 Hz at times, which is much better than my MBP, for example, but I can't use it for work or anything that requires me to be eyestrain-free and able to concentrate. I will probably return the iPad for this reason (and because I resent paying £1000 for something I can't use as I should be able to). I will concentrate on trying other devices, like gaming laptops, where it's more possible to set higher refresh rates to operate all the time, but if Apple added a 'set to 120 Hz as default' option, this could, I believe, be a game changer for the usability of Apple products.
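    There's no user-facing switch for this as far as I know, but for anyone who can run their own code: since iOS 15, an app can at least ask the system to hold 120 Hz for its own animations via CADisplayLink. A minimal sketch below; FixedRateDriver is a name I made up, it only affects the app that runs it, and the system can still override the request (e.g. in Low Power Mode):

    ```swift
    import UIKit

    // Minimal sketch: ask ProMotion hardware for a fixed 120 Hz while
    // this app is active, instead of the default variable 10-120 Hz.
    final class FixedRateDriver {
        private var displayLink: CADisplayLink?

        func start() {
            let link = CADisplayLink(target: self, selector: #selector(tick))
            // Request a constant 120 Hz frame rate range.
            link.preferredFrameRateRange = CAFrameRateRange(minimum: 120,
                                                            maximum: 120,
                                                            preferred: 120)
            link.add(to: .main, forMode: .common)
            displayLink = link
        }

        @objc private func tick(_ link: CADisplayLink) {
            // Drive redraws/animation here; the steady callback cadence
            // is what keeps the panel refreshing at the requested rate.
        }
    }
    ```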

      I tried to force my Macs to run at 60 Hz/fps all the time by overlaying a 60 fps video, but I didn't see any improvement. You can measure the OS frame rate by downloading the Quartz Debug tool. I also tested an old 2009 MacBook yesterday that I used for years: in slow-mo it flickers heavily, yet I never had a problem with it. I'm totally lost at this point. Colour dithering? PWM frequency? Faster response rate? HiDPI?
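      If you'd rather script the check than use Quartz Debug, here's a rough sketch that counts CVDisplayLink callbacks per second on macOS (CVDisplayLink is an old CoreVideo API, deprecated in recent macOS versions but still functional). Note this only reports the refresh cadence the system is driving; backlight PWM or panel-level dithering won't show up here, which is where the slow-mo camera test is more useful:

      ```swift
      import CoreVideo
      import Foundation

      // Rough check of the actual display refresh cadence: count how many
      // CVDisplayLink callbacks fire per second. Run as a Swift script;
      // Ctrl-C to stop.
      var link: CVDisplayLink?
      CVDisplayLinkCreateWithActiveCGDisplays(&link)

      var frameCount = 0
      let lock = NSLock()

      CVDisplayLinkSetOutputHandler(link!) { _, _, _, _, _ in
          lock.lock(); frameCount += 1; lock.unlock()
          return kCVReturnSuccess
      }
      CVDisplayLinkStart(link!)

      // Report the callback rate once a second; on a 60 Hz panel this
      // should hover around 60.
      Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { _ in
          lock.lock(); let n = frameCount; frameCount = 0; lock.unlock()
          print("\(n) display link callbacks in the last second")
      }
      RunLoop.main.run()
      ```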

      3 years later

      I came across an interesting paper on the Flicker Fusion Threshold. I suspect ours is high, which is "bad" because we notice flicker that "normal" people do not. The paper does mention that getting older reduces the threshold, so maybe that will be a good thing for us.
