Sorry I don't really know. I'm just still fascinated that anything OTHER than the monitor could be the culprit.
Safe hardware PC builds list
MrLoco OTHER than the monitor could be the culprit.
Mr. SunnyCove from 4pda also mentioned there can be cases where the motherboard sets up dithering in the monitor.
For example, my monitor doesn't support 10-bit, but the motherboard may send unsupported control signals to it, or send 10-bit data, which causes "sharp" and bad monitor behavior.
I got the same eyestrain when I plugged in a 2021 monitor, the Xiaomi Mi 27 2K... the feeling was a little reduced, but the same pressure. Today I'm taking a day off from testing the bad z690 because I woke up with a headache.
Can new (2020+) motherboards refresh only a custom area of the screen?
I noticed that the problems appear during reading (static content). And if you launch the calculator application and drag it around the entire screen, the pressure on the eyes stops for about 3 seconds. So gamers may not detect the problem at all.
Are there utilities, like screensavers, that can emulate small motion across the entire screen without interfering with reading?
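I don't know of a ready-made tool for this, but the idea is easy to sketch. Below is a minimal, hypothetical example in Python/tkinter (my own construction, not an existing utility): a tiny always-on-top window that keeps hopping along the screen edges, forcing small continuous redraws while you read.

```python
# Hypothetical sketch of a "keep something moving" helper: a tiny borderless
# always-on-top window that hops along the screen edges every 300 ms.
import itertools
import tkinter as tk

root = tk.Tk()
root.overrideredirect(True)          # no title bar or border
root.attributes("-topmost", True)    # stay above other windows
canvas = tk.Canvas(root, width=8, height=8, bg="gray50", highlightthickness=0)
canvas.pack()

w, h = root.winfo_screenwidth(), root.winfo_screenheight()
# Cycle through positions along the top and right screen edges.
positions = itertools.cycle(
    [(x, 0) for x in range(0, w - 8, 64)] +
    [(w - 8, y) for y in range(0, h - 8, 64)]
)

def hop():
    x, y = next(positions)
    root.geometry(f"8x8+{x}+{y}")
    root.after(300, hop)             # move again in 300 ms

hop()
root.mainloop()
```

Whether such constant motion actually suppresses the static-reading strain the way the calculator trick does would need testing.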
My z690 experience:
- Turn off all ASPM options in BIOS (some of them are enabled by default)
- Turn off SpeedStep and SpeedShift in BIOS
- Disable Above 4G Decoding and ReBAR
- Old NVIDIA driver 466.67, display driver only, all settings at default (ColorRange = Full, 8-bit, RGB)
- Change ColorControl's dithering mode from "Auto: Disabled" to "Disabled"
- Set LowLatencyMode = Ultra and Power management mode to "Prefer Maximum Performance" in the NVIDIA Control Panel
- Disable the monitor's 10-bit, 4:2:2 and 4:2:0 support and BT.2020 support (keep only BT.709) and delete all HDR metadata in CRU
- Set the Windows power plan to Maximum Performance
All of this significantly reduced my eyestrain (to be honest, the pressure between my brows).
These are the 8 settings I changed; let's discuss which are important and which are not. I also notice the problem more clearly when reading (static background) than when watching videos. The last step, the power plan, can be scripted; see the sketch below.
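Of the list above, only the Windows power plan step is easy to script; the BIOS, driver and CRU changes have to be made by hand. A minimal sketch (powercfg is the standard Windows tool, and the GUID is the built-in "High performance" scheme):

```python
# Switch the Windows power plan to the built-in High performance scheme.
# powercfg ships with Windows; the GUID below is the well-known default
# identifier of the High performance plan.
import subprocess

HIGH_PERFORMANCE = "8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c"

subprocess.run(["powercfg", "/getactivescheme"], check=True)  # show current
subprocess.run(["powercfg", "/setactive", HIGH_PERFORMANCE], check=True)
```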
@"photon78s"
All videos are 60 fps recordings of a 60 Hz monitor. You need to step through the videos frame by frame to see the FRC pattern. An interesting fact: in all the specs the monitor is 8-bit, but per the datasheet it sends only 6-bit data into the panel.
- 6-bit data input: you can see only pixel inversion: https://filetransfer.io/data-package/eqe6MFii
- 8-bit data input: you can see 4 repeating patterns (old-school traditional FRC type; modern A-FRC is very different): https://filetransfer.io/data-package/9aej86qS
- 8-bit data + spatio-temporal dithering via Ditherig: https://filetransfer.io/data-package/l39tphOQ
It's hard to see the FRC patterns in the 8-bit + dithering video because the signal becomes very noisy. Imagine what happens to letters whose strokes are only 1 vertical pixel row wide: they get a chaotic, non-static left-right shift, which overloads the eye nerves. To work around this, people set a lower resolution to make the letters wider/bolder, so the inner parts of each letter stay static (while the edge pixels of the letters keep dithering).
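If anyone wants to check their own recordings, a rough way to make the pattern jump out is to amplify the difference between consecutive frames. A sketch with OpenCV (the file name is a placeholder; press any key to step a frame, Esc to quit):

```python
# Rough sketch: amplify frame-to-frame differences in a monitor recording so
# FRC / dithering flips become visible. "recording.mp4" is a placeholder.
import cv2

cap = cv2.VideoCapture("recording.mp4")
ok, prev = cap.read()
while ok:
    ok, frame = cap.read()
    if not ok:
        break
    # Pixels that changed since the previous frame light up here.
    diff = cv2.absdiff(frame, prev)
    cv2.imshow("amplified frame diff", cv2.convertScaleAbs(diff, alpha=16))
    prev = frame
    if cv2.waitKey(0) == 27:  # any key steps one frame, Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```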
On my safe Gigabyte z390d motherboard, I am afraid to update the BIOS to the next version, because you cannot roll back to the previous one after the update. That update introduces capsule BIOS and NVIDIA's ReBAR function (the CPU can access all of the GPU's video memory). I am not sure the update would be safe for me; at least the z690 motherboard doesn't list it in the patch notes, so I think the z690 BIOS includes ReBAR by default.
I gathered some stats here on LEDStrain regarding users' issues with different laptops. If you exclude pure display problems (OLED, AMOLED, T-PWM screen backlight tech), mass problems start with:
1) Intel 11th-gen CPUs and newer: 8th/9th gen is good (my own experience); not many complaints about the 10th gen on LGA1200, but a lot of bad reports about 11th gen LGA1200 and 12th gen LGA1700 (my own experience). Perhaps the 10th gen is affected too
2) AMD 5800H and later series CPUs: my mate's 5600H is good, and the 4600H is also good (my own experience)
6-bit + FRC = 8-bit: 4 repeating pattern frames to get the extra 2 bits of color data
6-bit + Hi-FRC = 10-bit: 8 repeating pattern frames to get the extra 4 bits of color data
At 60 Hz, Hi-FRC is very noticeable and the eyes get strained; that's why manufacturers increase the refresh rate to 120+ Hz to smooth the effect out. The question is: manufacturers present 120 Hz as good for games and eyes, but can they really "mask" the 8 false frames?
On a 120 Hz panel you get only 120/8 = 15 real frames; all the others are "flickers"
On a 144 Hz panel you get 144/8 = 18 real frames
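The arithmetic in one place (a toy calculation; the 4- and 8-frame pattern lengths are the ones quoted above; note that a purely temporal pattern for n extra bits would need 2**n frames, i.e. 4 for FRC's 2 bits but 16 for 4 bits, so the 8-frame Hi-FRC figure presumably also leans on spatial dithering):

```python
# Toy calculation of the numbers above: how many fully-averaged images per
# second survive once the refresh rate is divided by the FRC pattern length.
for refresh_hz in (60, 120, 144):
    for name, pattern_frames in (("FRC", 4), ("Hi-FRC", 8)):
        real = refresh_hz / pattern_frames
        print(f"{refresh_hz} Hz, {name} ({pattern_frames}-frame pattern): "
              f"{real:g} real frames per second")
```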
Here is a review of a Hi-FRC monitor:
"I worked on a laptop for about five years and had no problems with my eyes at all. My vision has always been very good, a little sharper in the distance. And so I wanted to buy myself a larger screen to make working with the code more comfortable. My choice fell on the Lg 27Ul850-W monitor, since it has Type-C for my MacBook. At first, I didn’t feel any discomfort while working, fatigue in my eyes began to appear after two weeks, but since I didn’t experience any problems with my eyes at all, I didn’t understand what was happening at all and chalked it up to general fatigue.
As a result, after two months of torment with settings, different calibrations, nothing changed, my eyes began to hurt wildly. What I have now: I can’t look at any screen at all, my eyes burn, my head hurts, etc. I can read the news for no more than five minutes. I went to the ophthalmologist and threw out a 10 for an examination. A diagnosis of accommodation was made and glasses were prescribed for unloading near vision; the glasses also have a blue filter. I tried to sit at this monitor with glasses on, the situation did not change, although the glasses really take off the load in close proximity and it became noticeably noticeable on the laptop. As a result, I switched back to my laptop, my eyes began to recover, and Irifrin was also prescribed for a month.
I just can’t understand what’s wrong with this monitor, I turned down the brightness and contrast, I did a pencil test and didn’t notice the pwm, I also turned on slowmo at 250 fps on the iPhone, I also can’t see the pwm, in short, it’s some kind of fantasy. My wife thought that I just had sensitive eyes, she worked on it for a week and in the end her eyes also hurt. In general, I do not recommend this model. After all this, I was generally disappointed in monitors from LG."
In the specifications they claim an 8-bit panel, same as my BenQ, but the truth is it's 6-bit + FRC (4 patterns). In the datasheet summary you can see "panel can display 16.7M colors, 8-bit input", but in the diagram I posted above you can see that only 6-bit data is sent to the panel, while the input stays 8-bit.
Does anyone know models of true 8-bit monitors? In my experience you can verify this only with a microscope or a telezoom camera lens, recording at least as many FPS as the panel's refresh rate.
When you look at modern panels at a low refresh rate, some vertical rows can be seen all across the screen. I think this is the VCOM screen optimisation; in detail, it is rows of green vertical subpixels. When you switch the panel to 120+ Hz, those rows are still there, just no longer visible to the eye. Have a look at the diagram from the datasheet:
simplex interested in looking at these but they just expired today, can you reupload them?
DisplaysShouldNotBeTVs can you reupload them?
Which ones?
All monitors were in 8-bit 60 Hz mode with cleared EDID settings, on a z390 motherboard + RTX 3080 graphics card; the video is 4K60p
!!! you need 8x slowdown to see the FRC difference !!!
AOC Q27B3MA - 2K 60 Hz
https://cloud.mail.ru/public/g6vi/uG9gdWZ3r
BenQ BL2420Z - FHD 60 Hz (AUO M238HVN01.0)
https://cloud.mail.ru/public/nDaD/yy71Zr9kK
Xiaomi Mi 27 2K 165 Hz (AUO M270DAN02.B)
https://cloud.mail.ru/public/9w6Q/y7qSgnnvm
Philips 275V8LA - 2K 60 Hz (TPM270WQ1-SG1G012)
https://cloud.mail.ru/public/cqgm/xgmLS7Sgg
=======================================================
Upd:
And here is how simple dithering looks (z690 + iGPU + BenQ in 6-bit mode)
https://cloud.mail.ru/public/anaU/kahSkroTa
And here is how the classic 4-pattern FRC dithering looks (z690 + iGPU + BenQ in 8-bit mode)
https://cloud.mail.ru/public/H3BJ/7KjVMiJcA
=======================================================
Upd2:
3 video cards in the z390d + BenQ in 8-bit mode
The GTX 1060 has the most visible FRC patterns in terms of sharpness; the RTX 2000/3000 generations have stronger dithering (which hides the FRC patterns) and overall image "blurriness"
6-bit + FRC = 8-bit: 4 repeating pattern frames to get the extra 2 bits of color data
6-bit + Hi-FRC = 10-bit: 8 repeating pattern frames to get the extra 4 bits of color data
Are you sure this is the case? As far as I've read up:
6-bit + FRC = 16.2M colors
6-bit + Hi-FRC = 16.7M colors
As per http://www.lagom.nl/lcd-test/black.php at least.
At 60 Hz, Hi-FRC is very noticeable and the eyes get strained; that's why manufacturers increase the refresh rate to 120+ Hz to smooth the effect out. The question is: manufacturers present 120 Hz as good for games and eyes, but can they really "mask" the 8 false frames?
On a 120 Hz panel you get only 120/8 = 15 real frames; all the others are "flickers"
On a 144 Hz panel you get 144/8 = 18 real frames
Can you quote the source? I'm confused what you mean by this.
qb74 Are you sure this is the case? As far as I've read up:
Yes
Moreover, I suppose most 8/10-bit monitors are 6-bit + Hi-FRC. The bad part is that my camera is only 60p, so I can't check a monitor's FRC in 120 Hz mode to find the 8 "blended" frames. On my BenQ I detected 4 FRC pattern frames; on the Xiaomi it's hard to detect FRC patterns because it uses A-FRC (not classical FRC patterns)
z690 + iGPU, 6-bit vs 8-bit, + BenQ
At 6-bit there is no FRC at all, only the monitor's dithering
https://cloud.mail.ru/public/UhWz/kYGUfMPvr
Very interesting: I can't get the same FRC-free image when using the GTX 1060 / RTX 2070S / RTX 3080 cards, even when forcing the monitor to 6-bit. I get banding on the wallpaper, but the FRC is still there with a dGPU
I mean, these FRC patterns are always there in 6- or 8-bit monitor mode -> https://cloud.mail.ru/public/dMBD/ZU1XN4T7e
I was not able to spot any flicker in the videos you've recorded.
It's a bit hard to capture this without an oscilloscope + probe; sadly, cameras do not suffice for this.
qb74 I was not able to spot any flicker in the videos you've recorded.
Talking about this video -> https://cloud.mail.ru/public/UhWz/kYGUfMPvr
You need to download the video and then play it frame by frame. In the bottom (labelled 8-bit) part of the video you will see 4 repeating FRC patterns (vertical, horizontal, vertical, smooth) on the taskbar, just to the left of the "up arrow" symbol.
In 6-bit there are no patterns; only the monitor's random dithering noise is visible on the taskbar.
In the 8-bit case, on top of the dithering noise layer you saw in the 6-bit part, the monitor adds an FRC layer with 4 patterns (through the FRC patterns you can still notice the same 6-bit dithering noise structure)
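If stepping through by eye is hard, the 4-frame repeat can also be found numerically: compare each frame with the frame k steps later and look for the lag with the smallest average difference. A rough OpenCV sketch (the file name and the taskbar crop coordinates are placeholders):

```python
# Rough sketch: estimate the FRC pattern period in a recording by finding the
# frame lag k with the smallest mean difference. "recording.mp4" and the crop
# coordinates (taskbar region) are placeholders for your own video.
import cv2
import numpy as np

cap = cv2.VideoCapture("recording.mp4")
frames = []
ok, frame = cap.read()
while ok and len(frames) < 64:
    # Crop roughly to the taskbar (placeholder coordinates for a 1080p video).
    frames.append(frame[1000:1080, 0:500].astype(np.int16))
    ok, frame = cap.read()
cap.release()

for k in range(1, 9):
    diffs = [np.abs(frames[i + k] - frames[i]).mean()
             for i in range(len(frames) - k)]
    print(f"lag {k}: mean abs difference {np.mean(diffs):.2f}")
# A 4-frame FRC pattern should show a clear dip at lag 4 (and again at 8).
```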
simplex
Mate, do you not understand that my panel can dither and we cannot have the same experience as what you view IRL?
An oscilloscope + (fast) light probe is the objective way to measure this, not some random videos.
ensete
There is a "safe" gudeline I'd say for monitors.
But, ppl here just need to start evaluating biological factors instead though instead of blaming everything on the monitor.
A safe monitor consists of:
- Having least amount of flicker possible (can be easily troubleshooted with a fast oscilloscope + light probe)
- Highest refresh rate possible (sub-360hz is too blurry)
- Highest resolution possible (if possible, opt for high refresh laptop panels due to PPI)
- Does not use KSF phosphor / QDEF backlight (both seem to have reports of causing issues for people), instead employs regular LED, CCFL backlight or WOLED (seems to be fine for some people, only caveat is brightness dip at refresh rate cycle)
- Avoiding high brightness (100-150 nits is fine for daytime use, personal preference territory)
- Using a warmer whitepoint (D55 / 5500K is nicer than 6500K or colder, but this is personal preference)
There is a gray area however, which is very individual + not a lot of info on this.
- Matte/Glossy considerations (some work for ones, not for others)
- Polarizer orientation (some orientations work better for ones, not for others)
- Dithering technique used (temporal, simple temporal or static, I've seen reports of static being horrendous for some)
People here use extremely archaic & ridiculous panels imo.
There are outside factors such as:
- Light in room (bulbs are notorious for flicker)
- Biological/psychological factors (vitamin/mineral deficiency, stress levels, hormonal imbalances, diet, etc.)
- EMFs (yes, they play a major role in health)
qb74 not some random videos.
The main issue with modern monitors is not dithering or FRC; the issue is the VCOM adjustment that prevents pixel sticking.
Looking at my vids you will see different VCOM behavior on each monitor. You don't need a microscope to record VCOM: a 60 Hz monitor, a 60p camera and good zoom are enough (the main rule: the camera should give you a high-resolution picture and record at a much higher FPS than the monitor refreshes; better still, sync the recording speed)
There are many influencing factors that can be listed, but to check each of them, you need to take measurements.
I believe that monitor problems can be reduced by following these rules:
1) No more than 60/75 Hz (I saw harsh VCOM in all high-refresh-rate monitors)
2) No HDR and no excessive brightness (<= 250 nits)
3) Year of manufacture 2014...2020
4) IPS or VA panel, it does not matter
5) WLED backlight without quantum dots or other extended-gamut technologies, giving sRGB coverage of about 90...100%
6) Monitor coating: very light matte (hard matte blurs the structure of the RGB subpixels and strains the eyes)
7) Almost all monitors are now free of low-frequency PWM (the PWM is > 10 kHz)
8) Almost all monitors have a blue peak at 446...460 nm; this is safe
9) BASIC color temperature around 6500K; when looking at the monitor there should be no cold shades
10) DP or HDMI port, it does not matter. Both can dither, depending on the EDID settings