MagnuM No, but I definitely get motion sick when playing 3D games where you move quickly and the path is narrow. Or when I watch related videos. The more open the field, the better it gets.

Edit: To clarify, my quoted comment was not related to this. The question I asked was to understand whether 144 Hz might help with the eye strain you get when looking at static screen content (e.g. Windows 10, Firefox Quantum, some Android versions...).

deepflame I expected as much, since it takes a few days for most new stuff to really show whether it's going to be OK long term. I didn't think refresh rate would help, based on personal experience and the way LCDs refresh, but it was worth a shot. Whatever the root cause is for you, it is still happening.

    hpst Have you found a setup yet that works for you for long periods of work in front of a screen?

    I have to say that I was diagnosed with nystagmus (uncontrolled eye movement/flickering) and thought that my condition might have something to do with that as well. However, I can remember having streaks of long working hours years ago, like most of you. Or maybe I'm just getting old(er), not sure.

      deepflame I have some weird eye stuff as well, but it's been proven not to be the root cause of my problem. I wouldn't argue against it making things worse... but in my opinion there is definitely a technology issue at play here for 99% of people, as shown by the fact that everyone has at least one "safe" thing they can look at... and we don't get the same strain in non-screen-related life.

      deepflame I have to say that I was diagnosed with nystagmus

      I've had nystagmus since I was born.

        JTL Just a thought, might not be accurate, but maybe since the screen is running at a higher refresh rate the dithering is "running" at a faster rate?

        If that's the case just wait until I get my dithering fixes done. 🙂

        Also, anyone have thoughts on this?

        • andc replied to this.

          JTL I've had nystagmus

          so you mean it is gone now with your "special treatment"?

          • JTL replied to this.

            JTL

            JTL Just a thought, might not be accurate, but maybe since the screen is running at a higher refresh rate the dithering is "running" at a faster rate?

            If that's the case just wait until I get my dithering fixes done. 🙂

            It might be the case - in my tests the iPad Pro (120 Hz) didn't show signs of flicker, unlike 60 Hz displays. I guess it is more likely to be related if the dithering happens on the hardware side, but who knows 🙂
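
            To illustrate what "the dithering running faster" could look like, here is a toy sketch of a simple 6-bit + FRC scheme (the 4-frame cycle and the numbers are my own illustration, not a claim about any particular panel): the alternation between adjacent panel levels happens once per refresh, so a higher refresh rate cycles the same dither pattern faster.

            ```python
            # Toy illustration of temporal dithering (FRC): a 6-bit panel approximates an
            # 8-bit value by alternating between two adjacent 6-bit levels on successive
            # refreshes. The 4-frame cycle below is made up for illustration.

            def frc_sequence(target_8bit, frames):
                """6-bit level shown on each of `frames` refreshes for an 8-bit target."""
                low, remainder = divmod(target_8bit, 4)  # 8-bit -> 6-bit level + 2-bit rest
                high = min(low + 1, 63)
                # Show the brighter level on `remainder` out of every 4 frames, the darker otherwise.
                return [high if i % 4 < remainder else low for i in range(frames)]

            print(frc_sequence(130, 8))  # -> [33, 33, 32, 32, 33, 33, 32, 32]

            # The same 4-frame cycle simply repeats faster at a higher refresh rate:
            for hz in (60, 144):
                print(f"{hz} Hz: dither cycle repeats {hz // 4} times per second")
            ```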

            9 months later
            deepflame changed the title to 144Hz/240Hz Screens - not just for gaming?

            @jasonpicard had some very valuable input in the "flicker free LED lights" thread ( https://ledstrain.org/d/106-flicker-free-led-lights/109 ), and I thought it would be better to continue the discussion about gaming gear / overdrive functionality here.

            @jasonpicard I understand that this tech may be very interesting for gamers, but I wonder how and why it would help people like me who mostly read text and watch some YouTube videos.

            When you said that the LG24GL600F solved all the eyestrain issues you had for around 11 years, I wondered: does the overdrive function smooth out all the flicker coming from the computer by (if I understood correctly) predicting the next pixel/color values and interpolating them to 144 Hz?

            Did you also have issues before when looking at a steady Windows desktop where nothing is moving? I now have issues after looking at a phone or screen for just a few seconds and react to it immediately (even with no active motion such as video).

            Does Blur Busters have an updated list of monitors with good overdrive features, and is the one you have now "the best" on the current market in this regard?

              deepflame I totally had issues even when reading text on a screen or a webpage or doing anything. Someone was just showing me a video on his crappy phone and it was hurting my eyes. Did you watch the video on eye tracking? https://www.blurbusters.com/faq/oled-motion-blur/ Look at this page and check out the explanation of eye tracking. Every LED/CCFL display follows this rule. That is why a higher refresh rate will help. The way I understand it, though, it will take a 1000 Hz refresh rate to equal a CRT without using tricks like overdrive and black frame insertion: https://www.blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/

              Check out the "journey to 1000 Hz" article. LEDs are kind of terrible when you look at the rules they follow; your brain has to compensate for crazy things happening. Regarding your YouTube question, mine would be: does anyone here know what frame rate YouTube videos are filmed at? Is it 24 frames per second or something else? That creates a different problem than reading text on a website. Movie theatres usually run at 48 or 72 Hz, so movies work perfectly with them because they are usually filmed at 24 fps. They double-scan or triple-scan the image, and I think this makes it flicker-free. Last-generation CRTs were capable of this, running at 100 Hz or 120 Hz and doing a double-scan trick that made them flicker-free.
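
              To put rough numbers on the "it takes about 1000 Hz" claim, here is a quick back-of-envelope sketch of the persistence rule those articles describe (the 1000 px/s tracking speed and the exact figures are my own assumptions, not from Blur Busters):

              ```python
              # On a full-persistence (sample-and-hold) display, perceived motion blur is
              # roughly eye-tracking speed multiplied by how long each frame stays lit.

              def blur_px(tracking_speed_px_per_s, persistence_ms):
                  """Approximate width of the motion-blur trail, in pixels."""
                  return tracking_speed_px_per_s * persistence_ms / 1000.0

              speed = 1000.0  # px/s, e.g. a moderately fast scroll or pan (assumed value)
              for hz in (60, 144, 240, 1000):
                  persistence = 1000.0 / hz  # each frame is held for the full refresh period
                  print(f"{hz:>4} Hz -> ~{blur_px(speed, persistence):.1f} px of blur")

              # Roughly: 60 Hz ~16.7 px, 144 Hz ~6.9 px, 240 Hz ~4.2 px, 1000 Hz ~1 px --
              # which is why the "Blur Busters Law" article argues it takes ~1000 Hz to
              # approach a CRT without overdrive or black frame insertion.
              ```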

              deepflame I am not sure about the list, but I have spent a lot of time there. When you are buying a monitor you can look up reviews on rtings.com, and they usually tell you what level you can set the overdrive to without creating additional problems. The link I sent you in the other thread about dither and the almost ten other different LED problems is essential reading. Even in the questions I asked in that link, he says things about screens I have never heard anyone else mention before, like the prototype 98 CRI LEDs that use a violet chip and have almost no blue light, except they are too expensive to make right now.

              Or maybe for me the issue is just that I don't like bright screens. Before LED/CCFL, all our screens were 2900K. I'm testing a flicker-free LED right now at 2700K in my gaming room. So far it seems to be good, but I want to wait longer. If it goes well, I am going to buy three more at 2200K to get a more yellow look, but the lumens are higher so they produce more light. If I can still game for hours on end, which I usually do once a week every Sunday night, I will know they are good.

              I believe for me PWM is the worst, and now I know CRI isn't the issue, because from what I understand all LED screens use crappy CRI 80 backlights, or you can't even find the information. I think I might see even more improvement with flicker-free LED bulbs, because the incandescents I use have minor flicker, so I will essentially be in a room with no flicker when I'm gaming.

              deepflame I remembered one other thread on Blur Busters where most of the people were complaining about the 240 Hz monitors. Most people are waiting for the second generation of panels to come out, which should probably be this year. A few have been mentioned already, but that doesn't mean they will be good. Wait for these guys and other sites to test them first.

                jasonpicard

                I am trying to understand how refresh rate could be such a problem, and if it is the issue, why most of us do better with old CCFL displays and old LED-backlit phones, which are 60 Hz but still LCDs and draw the image the same way regardless of backlight. Do you have a theory as to why? Also, why are the 240 Hz displays being complained about so much, if faster refresh is better? I may be missing some fundamental bit in all the info, so apologies if you have already covered that.

                As @deepflame asked, I have major problems even on websites with no apparent movement. I don't game, nor do I have the hardware for it, so I can't compare motion blur in that regard.

                  jasonpicard The link takes me to a 5yo post with 2017 as the most recent reply. Did you mean to link another one?

                    hpst I will try my best to answer this. https://forums.blurbusters.com/viewtopic.php?f=10&t=5327&start=30 On Blur Busters there is an extensive collection of information from 2013 until now. This quote is from the head guy at Blur Busters, in the link I just posted:

                    "Some displays have scan-converting electronics, especially when panel scanout velocity or direction diverges from cable scanout. I know that some panels are fixed-horizontal-scanrate (e.g. BenQ XL2540 240Hz and XL2735 1440p) while some panels are variable-scanrate (e.g. BenQ XL2720Z 144Hz@1080p). So that's why the XL2540 has crappy 60Hz lag, it framebuffers the slower scanning 1/60sec cable signal before the fast 1/240sec scanout.

                    Some displays scan in a different direction (bottom to top), or require full-framebuffer preprocessing, so they have to framebuffer the full refresh cycle before beginning to scanout. RTINGS found a few HDTVs that scanned bottom-to-top. Additionally I know all DLP projectors & plasma displays split a refresh cycle into multiple subfields (although the two technologies do it very differently, each refresh cycles are still effectively split up into lower-bit-depth dithered subfields) -- so DLP/plasma mandatorily framebuffers a full refresh cycle before beginning the first subfield output.

                    If it is TN, the GtG lag is insignificant enough statistically (it becomes more of a visual complaint, and no longer a lag complaint -- like overshoot artifacts, like 1 pixel bright overshoot artifact at 2000 pixels/sec for imperfect 0.5ms overdrive -- still a human-visible ghosting).

                    Just another error margin item to pay attention to.

                    I believe it is a bunch of different reasons. What you are explaining to me is just your take; I have learned that everyone sees things differently, and what you are saying doesn't apply to me. I can't use any CCFL screen or any LED screen unless I can get the motion blur to an acceptable level. Also, I think screen brightness might have some sort of effect on me too; I seem to like dim screens. I'm mainly trying to point out rules that a lot of people seem to overlook and that apply to every LED and CCFL display.

                    CCFL uses phosphor, which to my understanding is softer on the eyes. The head guy at Blur Busters told me that LED doesn't use phosphor on the blue part of the spectrum, only the red and green, and this makes it harder for a lot of people to process. CRT and plasma use phosphor; plasma is super easy on my eyes, and a Sony Wega CRT is super easy on my eyes as well. I believe he said a 120 Hz LED display with BFI was comparable to a 75 Hz CRT. Even though BFI induces flicker, it's not the same as PWM flicker; users on his site who can't use monitors with PWM seem to do great with BFI. It is all user dependent. I think CCFL, depending on the panel, will draw to the screen differently. It seems TN works better for me.

                    Another lesson I learned was that if you can use plasma, CRT, or DLP, dither is not your issue, as the dither on those three is way worse than anything an LED display could produce. In the link in the other thread about dither he gives multiple examples. He even mentions that a 165 Hz IPS is the ultimate choice if you think flicker or dither is your main problem. TN I believe has the worst dither, and I think the monitor I mentioned that works for me is 6-bit + FRC, but clearly that is not my issue.

                    I can't use any CCFL or any LED except flicker-free OLED phones. I currently use a YotaPhone 2 and used to own a Samsung S2. I'm planning on buying one of the new DC-dimmed OLED phones soon, though I want a higher-refresh-rate one; the Nubia gaming phone is a 90 Hz DC-dimmed OLED. OLED follows the same motion-blur rules as well: even though the colors can change fast on OLED, the MPRT is still slow at 60 Hz. Again it creates the sample-and-hold (eye tracking) issues, which for me is probably my main reason for eye strain, or one of them.
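
                    As I understand it, the point of the "120 Hz LED with BFI" comparison is persistence: with BFI the frame is only lit for part of each refresh, so the blur shrinks even though the refresh rate stays the same. A rough sketch along the lines of the earlier persistence numbers (the 50% on-time and the tracking speed are my own assumed figures, not measured values):

                    ```python
                    # With black frame insertion the image is only visible for a fraction of each
                    # refresh, so effective persistence -- and with it sample-and-hold blur --
                    # drops without raising the refresh rate (at the cost of visible flicker).

                    def persistence_ms(refresh_hz, visible_fraction=1.0):
                        """Milliseconds per refresh that the frame is actually lit."""
                        return 1000.0 / refresh_hz * visible_fraction

                    speed = 1000.0  # px/s eye-tracking speed (assumed)
                    cases = {
                        "60 Hz sample-and-hold": persistence_ms(60),
                        "120 Hz sample-and-hold": persistence_ms(120),
                        "120 Hz with 50% BFI": persistence_ms(120, visible_fraction=0.5),
                    }
                    for name, p in cases.items():
                        blur = speed * p / 1000.0  # same blur rule as the earlier sketch
                        print(f"{name}: ~{p:.1f} ms lit -> ~{blur:.1f} px of blur at {speed:.0f} px/s")
                    ```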

                    hpst Sorry it is an old post, but it has incredible information in it.

                    • hpst replied to this.

                      jasonpicard You had mentioned a new post had been started, so I thought it might have been the wrong link.

                      I wasn't familiar with BFI. Is this a TV-only thing, or do computer displays insert these frames based on GPU instructions, etc.?

                      I don't watch TV, nor do I really have access to a plasma or CRT, so I can't test that theory easily, but I would be interested to know. I have a PWM-free laptop that is miserable and heavy-PWM ones that are OK, so that in itself clearly isn't my problem. I cannot pin down one offending factor that always hurts or helps.

                        dev