• Hardware
  • Could lower refresh rate be better? 240 vs 120 vs 60

I've had many monitors and the conclusion was always the same: even when I "only" had a 144Hz monitor I would drop the refresh rate down to 100Hz, and now I do the same with 240Hz. Yet all the sources say that a higher refresh rate should be better for everyone, including "us".

Could it be an issue with the GPU? RAM? Windows? Or is there an actual argument against higher refresh rates?

Anecdotally, I only like to use 60Hz panels. I don't like how the new "everyday use" BenQ monitors default to 100Hz now. They can be stepped down to 60Hz in the Windows settings, or so I'm told by BenQ support.

The issue I've noticed with panels above 60Hz is that they rarely stay at, say, 120Hz. The MBP line is notorious for this with ProMotion. There's a lot of downshifting/throttling going on, especially in laptops, and in that case it's worse.

I bumped into a regional manager at Best Buy who knew all about PWM. We talked for a while and he said that I needed something “faster than my nervous system” so that I wouldn’t be symptomatic. He recommended 240 if possible.

I'm not sure I buy that, because I also find 60Hz more comfortable. I also think factors like dithering, blue light, text rendering, and a general sensitivity to light complicate this. I looked at one of the recommended high refresh rate monitors and it was just overkill at 32" and way too bright.

It also depends what tasks you’re going to be performing. Gaming, video editing, music production, and coding are vastly different. So maybe certain use cases are better at certain refresh rates.

The answer is clear

All panels are 6-bit

If you set 60Hz, the FRC flicker frequency is low (15..20Hz); that is acceptable, and it depends on the FRC type used. If you set 50Hz, it would be even more acceptable.

All high-frequency monitors (120Hz) raise the FRC frequency to 60Hz, which is very fast and overloads the nerves (stroboscope effect): the pixels flicker twice as fast.

The BenQ 100Hz is marketing: in the case of my GW2790E, they used an old 2014 LG panel and overclocked it, which produces vertical line artifacts.

If the panel has VRR, the FRC frequency also moves up and down, and you get nausea faster than with a constant refresh rate.
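
A minimal sketch (Python) of the relationship described above, assuming the FRC pixel flicker frequency is simply the refresh rate divided by the length of the repeating FRC pattern; the function name and the 2- and 4-frame pattern lengths are my own illustration, not panel specifications:

```python
# Assumption: FRC flicker frequency = refresh rate / length of the repeating
# FRC pattern (the pattern length is panel-specific, typically a few frames).
def frc_flicker_hz(refresh_hz: float, pattern_frames: int) -> float:
    return refresh_hz / pattern_frames

for refresh in (50, 60, 100, 120, 240):
    for frames in (2, 4):
        print(f"{refresh:>3} Hz refresh, {frames}-frame pattern -> "
              f"{frc_flicker_hz(refresh, frames):5.1f} Hz pixel flicker")
```

On that assumption a 60Hz panel with a 4-frame pattern flickers at 15Hz, while a 120Hz panel lands somewhere between 30Hz and 60Hz depending on the pattern length.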


    simplex

    Is there a standard equation used to determine FRC freq based on refresh rate? Or is it dependent upon the software?

    Anecdotally, this makes sense. I tried the Lenovo Yoga Slim OLED. It was most comfortable at the 90Hz selection - both 120 and 60 were uncomfortable. The settings page claimed it was an 8-bit display. It does have PWM, though.

    I did a side by side test of the Apple Studio Display with a Mac Mini and a Mac Studio M3 Ultra. The Mac Studio was infinitely more comfortable than the Mini. The Studio Display is locked at 60Hz so I have no idea what is going on with that other than a difference in dithering algorithm or that the Macs are genuinely not able to consistently run 60Hz and are throttling.

      AshX It does have PWM, though

      PWM is another factor. Imagine: your backlight LEDs flicker, and the pixels on the screen also flicker, each at its own frequency. With a PWM-free screen you get only the pixel flickering (FRC).

      AshX Is there a standard equation used to determine FRC freq based on refresh rate?

      Nope, I did the measurements myself with the panels. First I ran the panel in 6-bit 60Hz mode and made a 4K60 video recording, then ran the panel in 8-bit mode (and found the FRC patterns), then ran it in 6-bit + GPU dithering (spatial, dynamic, temporal) and compared the difference. 4 repeating FRC patterns on a 60Hz screen mean 15Hz flicker.

      Another flicker factor is pixel inversion (4 types), but in my experience it has no impact; its frequency follows the screen_refresh_rate / 2 formula and it produces a uniform fill.
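
      For what it's worth, a rough sketch of how that recording-based measurement could be automated: compare each captured frame against the first one and look for the point where the image repeats. The file name, the 32-frame window, and the use of OpenCV are my assumptions; a real measurement needs the camera frame rate matched to the panel and a static test image on screen.

```python
import cv2
import numpy as np

# Hypothetical 4K60 recording of a static test image on a 60 Hz panel.
cap = cv2.VideoCapture("panel_6bit_60hz.mp4")
ok, first = cap.read()
assert ok, "could not read the recording"
first = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY).astype(np.float32)

diffs = []
for _ in range(32):                       # compare the next 32 frames to frame 0
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    diffs.append(float(np.mean(np.abs(gray - first))))
cap.release()

# The FRC pattern length is roughly the offset of the frame that looks most
# like frame 0 again; with a 4-frame pattern on a 60 Hz panel that gives
# 60 / 4 = 15 Hz pixel flicker (pixel inversion adds a refresh_rate / 2 term).
period = int(np.argmin(diffs)) + 1
print(f"estimated FRC pattern length: {period} frames")
print(f"estimated pixel flicker at 60 Hz: {60 / period:.1f} Hz")
```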

      AshX claimed it was an 8-bit display

      The best way to check whether a display is truly 8-bit is to run it in 6-bit mode with GPU dithering disabled. If you feel better, the monitor is 6-bit with aggressive FRC. All the monitors/screens I tested (2K75 / 2K165 / 2K120 / FHD 60 / FHD 100) were 6-bit; I am not sure true 8-bit exists.
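
      A companion check (my suggestion, not from the post above): render a smooth grayscale ramp with GPU dithering disabled. A true 8-bit panel shows roughly 256 fine steps, while a 6-bit panel without FRC collapses them into about 64 visible bands. Pillow and the output file name are assumptions here.

```python
import numpy as np
from PIL import Image

# Full-width 0..255 grayscale ramp; view it full-screen with dithering off
# and count the visible bands (~64 bands suggests a 6-bit panel).
W, H = 2560, 200
ramp = np.tile(np.linspace(0, 255, W, dtype=np.uint8), (H, 1))
Image.fromarray(ramp, mode="L").save("gray_ramp.png")
print("open gray_ramp.png full-screen and count the visible bands")
```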


        simplex The best way to check whether a display is truly 8-bit is to run it in 6-bit mode with GPU dithering disabled. If you feel better, the monitor is 6-bit with aggressive FRC. All the monitors/screens I tested (2K75 / 2K165 / 2K120 / FHD 60 / FHD 100) were 6-bit; I am not sure true 8-bit exists

        This is very interesting.

        It seems like the effect is cumulative, then. Perhaps there is a threshold unique to each individual which they can tolerate. I wonder why many of us were able to tolerate less-than-perfect monitors in the 2000s/early 2010s.

        In theory, then, software updates to the OS or GPU drivers could alter the dithering frequency and/or intensity? That might explain why many people update the OS on their phones and computers and the device suddenly starts causing discomfort.

        There is too much obfuscation for the sake of marketing and sales. It’s clear Apple is the worst offender the past decade.

          AshX software updates to the OS

          This

          They can change blend type (i.e. dithering)

          For example, an old OS uses only spatial (per-frame) dithering in the graphics pipeline. Nowadays, with more powerful GPUs, they can run temporal dithering (spatial, but carrying state across framebuffers), or use double/triple blending methods. In theory it works okay; in practice we get a chaotic pixel circus.

          Spatial is good. I use it on an old ASUS 6-bit laptop (via the GPU), and on a new laptop (switching the display to 6-bit automatically sets spatial dithering in the iGPU). Looks good.
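
          A toy sketch of the spatial-vs-temporal difference described above (my own illustration, not simplex's exact pipeline): quantize an 8-bit value to 6 bits using an ordered-dither threshold pattern that is either fixed (spatial, static from frame to frame) or shifted every frame (temporal, so pixels alternate between the two nearest 6-bit levels over time).

```python
import numpy as np

bayer2 = np.array([[0, 2], [3, 1]]) / 4.0             # 2x2 ordered-dither thresholds

def dither_to_6bit(img8, frame_index, temporal=False):
    """Quantize an 8-bit grayscale image to 6-bit levels with ordered dithering."""
    h, w = img8.shape
    thresh = np.tile(bayer2, (h // 2 + 1, w // 2 + 1))[:h, :w]
    if temporal:
        thresh = (thresh + frame_index * 0.25) % 1.0   # shift the pattern each frame
    # 8-bit -> 6-bit: one step is 4 codes; add the threshold before truncating
    return np.clip((img8 / 4.0 + thresh).astype(int) * 4, 0, 252)

img = np.full((4, 4), 130, dtype=np.uint8)             # flat mid-gray test patch
spatial = [dither_to_6bit(img, f) for f in range(4)]
temporal = [dither_to_6bit(img, f, temporal=True) for f in range(4)]
print("spatial: identical every frame ->", all((f == spatial[0]).all() for f in spatial))
print("temporal: pixel (0,0) across frames ->", [int(f[0, 0]) for f in temporal])
```

          In this toy version the temporal variant produces exactly the kind of per-pixel alternation over time that FRC does inside the panel itself.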

          Low refresh reduces the optokinetic effect (refresh scanning, scrolling, games) if you're sensitive to that.

          Low refresh on OLEDs also slows the 'backlight' scanning and reduces its optokinetic effect.

          dev