Potential correlation between video timings and flicker
Unfortunately, I can't answer that, as my "good sample" is 1080p native. But if this theory holds up through other users' testing, maybe someone with a different resolution (from a good setup) will share their good parameters too. Even for 1080p, there might be something better out there.
I have a 2016 Asus G752VT laptop (i7-6700HQ, Nvidia GTX 970M graphics) with a 17.3-inch 75 Hz IPS screen that I am totally comfortable with.
Then I bought an Asus EyeCare VA24DQ, which is a 24-inch FHD 75 Hz monitor.
So the monitor has the same resolution and refresh rate as my laptop, but with it I get dizzy and headachy within 1-2 hours of use.
So how would I go about replicating my perfectly fine laptop display config onto my external Asus monitor?
I just need to install the CRU utility and go from there?
Is it straightforward?
Yes, you would copy the same parameters I show in my screenshot from your good setup, then create and apply a new custom resolution with those same settings on your external monitor.
If the Nvidia software doesn't have an option for this, I suggest the CRU utility.
*Please turn off FreeSync/G-Sync before doing this, otherwise it may fail. You can turn it back on after you're done.*
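If you want to sanity-check the numbers before applying them, here is a minimal sketch (Python, with made-up example values for a 1080p 75 Hz mode; your actual CRU parameters will differ) of how the detailed timing fields relate to each other:

```python
# Minimal sanity check for custom resolution timings.
# The invariant: refresh_rate = pixel_clock / (h_total * v_total),
# where each total = active + front porch + sync width + back porch.

def total(active, front_porch, sync_width, back_porch):
    """Total pixels per line (or lines per frame) including blanking."""
    return active + front_porch + sync_width + back_porch

# Example parameters (assumed for illustration, not from any real EDID):
h_total = total(active=1920, front_porch=48, sync_width=32, back_porch=80)  # 2080 px
v_total = total(active=1080, front_porch=3, sync_width=5, back_porch=31)    # 1119 lines

pixel_clock_hz = 174.50e6  # pixel clock as entered in CRU, in Hz

refresh_hz = pixel_clock_hz / (h_total * v_total)
print(f"h_total={h_total}, v_total={v_total}, refresh={refresh_hz:.3f} Hz")
# If this doesn't land close to the refresh rate you expect,
# one of the copied fields is probably wrong.
```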
machala Any serious health condition? But most people don't suffer from this… only we do (a very small minority). How could your theory explain that?
arturpanteleev Doesn't that apply to all theories on this board? Probably some variation in brain function that only affects a small population (a lower threshold for processing flicker as movement, an inability to compensate for it), maybe in combination with some binocular vision dysfunction. Think of it like the also relatively small group of people with idiopathic epilepsy whose nerves are more easily "overexcited", and the even smaller subgroup of those affected who are sensitive to flicker as a seizure trigger.
Timing formulas are kind of strange because they're legacy artifacts from when electron-gun CRTs were around; that's where the sync polarity stuff comes from. LCDs don't have a polarity. There are a bunch of formulas, but they all have the same goal: to draw the raster image from top to bottom.
The main difference between GTF and CVT-RB is that RB uses tighter timings with a reduced blanking interval, meaning it needs less pixel bandwidth and has a shorter delay before drawing the next frame.
RB is needed at higher resolutions on links that don't have much bandwidth, such as DVI or HDMI 1.4.
There is a visible difference depending on the setup, but it's not night and day.
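For anyone curious where the CVT-RB numbers come from, here is a rough sketch (Python) of the CVT reduced-blanking v1 calculation as I understand it from the VESA spec; treat the constants and the resulting mode as an approximation, not something authoritative:

```python
# Rough sketch of the VESA CVT Reduced Blanking (v1) timing calculation.
# Constants as I understand them from the CVT spec:
RB_H_BLANK = 160         # fixed horizontal blanking, pixels
RB_V_FPORCH = 3          # vertical front porch, lines
MIN_V_BPORCH = 6         # minimum vertical back porch, lines
RB_MIN_V_BLANK_US = 460  # minimum vertical blanking time, microseconds
CLOCK_STEP_MHZ = 0.25    # pixel clock granularity

def cvt_rb(h_active, v_active, refresh_hz, v_sync_lines=5):
    # v_sync_lines depends on aspect ratio (5 lines for 16:9).
    # Estimate the horizontal period (us) from the frame time
    # minus the minimum vertical blanking interval:
    h_period_us = ((1_000_000 / refresh_hz) - RB_MIN_V_BLANK_US) / v_active
    # Blanking lines needed to cover at least 460 us:
    vbi_lines = int(RB_MIN_V_BLANK_US / h_period_us) + 1
    vbi_lines = max(vbi_lines, RB_V_FPORCH + v_sync_lines + MIN_V_BPORCH)

    v_total = v_active + vbi_lines
    h_total = h_active + RB_H_BLANK
    # Pixel clock in MHz, rounded down to the clock step:
    pclk_mhz = refresh_hz * v_total * h_total / 1_000_000
    pclk_mhz = int(pclk_mhz / CLOCK_STEP_MHZ) * CLOCK_STEP_MHZ
    return h_total, v_total, pclk_mhz

h_total, v_total, pclk = cvt_rb(1920, 1080, 75)
print(f"1920x1080@75 CVT-RB: h_total={h_total}, v_total={v_total}, "
      f"pixel clock={pclk:.2f} MHz")
# GTF pads the blanking far more generously, so it needs a noticeably
# higher pixel clock for the same mode.
```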
So unfortunately, after thorough testing, I have to say this is probably a dead end and my conclusions were a bit premature. I do see differences with different timings, but I was unable to use this method to fix my new PC, which produces a picture that is unbearable for me. However, thanks to what happened, I have another finding about monitors that I will write about in a new thread.
arturpanteleev To be honest, I don't think that only a very small minority is affected. In fact, I strongly believe the problem is much more widespread, but many people just don't recognize it as a problem. For example, I can explain the huge popularity of OS dark themes with nothing other than the fact that a dark theme reduces flickering (true for PWM on AMOLED, for example). Lots of people have never used older Windows (even older builds of Windows 10) and have never experienced non-flickering displays and OSes. So they accept it as a fact that their eyes get tired quickly from the computer and practice bullshit techniques like 20-20-20, enabling night shift, wearing yellow glasses, etc. I have been staring at computer screens for hours since the early 2000s and had never experienced any kind of eye strain until I updated to the latest Windows 10 and to macOS Monterey on my MacBook last year.
I don't use dark themes to reduce flickering; I use them because staring at a bright white rectangle is tiring and can sometimes induce a migraine. I find it harder to read text on a white background because it's just too much contrast: a little bit of black or blue in a sea of illuminated white.
Sure, I used light themes before, but displays used to be smaller, and they didn't have the very high, blue-shifted white points that are common now.
On the contrary, I am personally convinced that those "bright whites" are caused by the flickering effect. Many people complain that the white on today's monitors is too bright, too sharp; they decrease brightness and change color temperature to no avail, simply because that isn't where the problem lies.
I remember very well from the CRT days that screens at lower refresh rates had a very "sharp" image that hurt the eyes, and the white seemed too bright; it looked less bright at higher refresh rates.
With LCDs, and potentially PC-side sources of flickering, the flicker pattern is different from CRTs, where the whole image was redrawn, so it is less apparent and has a different effect on our vision.
I can tell you that there are PC-and-screen combinations where it is perfectly fine for me to stare at a white screen for ages. In many other cases I can't stand it for long… and there are cases where I can't stand it for even a few seconds. And it is NOT about brightness: I can change the RGB profile on such screens to make the whites look yellow, and it doesn't change a thing.
And what you describe confirms what I'm saying: if the problem were whites that are too bright, you would simply decrease the brightness as the first obvious option. Rest assured that a monitor at a reasonable brightness setting is far less light exposure than a normal bright day outside. But you don't change the brightness… simply because it doesn't solve the problem for you, even though it should.
No, I leave my monitor's brightness at 100% because of PWM. But it's fine, because the monitor is probably 13 years old already, so the CCFLs will have dimmed a little by now, and back then they didn't have super-bright tubes anyway.
I'm old enough that I still have a green phosphor monitor in the basement.
Yes, the pattern was different. With CRTs it was a single narrow line moving from the top down. I imagine that at higher refresh rates like 75 to 100 Hz (I liked 100) the line wouldn't remain in the same spot as long, so perception would have been affected, but in terms of overall light output I doubt there would have been any difference: no backlight at all, just a line. That's probably part of why some people like OLED displays; they don't have backlights either. With LCDs you have the crystals moving into different orientations in front of a backlight that can also be doing its own thing. Fluorescent lights originally ran at 60 Hz, but manufacturers later moved to electronic ballasts, which allowed them to run at 20 kHz, and that was a big difference. I replaced some magnetic ballasts with electronic ballasts in my basement myself and could see a noticeable difference; the electronic ballasts were hugely better. Lots of factors at play with this issue.
All I am saying here is that I don't think flicker is the be-all and end-all. I mean, I am fine with my OLED phone even with its flicker. I just keep it at a comfortable brightness and colour temperature.
Temporal dithering, I agree, is more problematic, and my monitor does have that built into the panel since it's a TN.
I think the best thing to look for is probably a less spiky spectrum: not a giant blue peak with very low green and red, where everything else has to fight harder to compensate. It would be better if there were more than just three peaks, with their heights reasonably close to each other.