Hey folks,

I suggest we build a list of safe workstations to share with each other, because my z690p motherboard seems to cause eye-strain in dGPU mode (gtx1060 / rtx2070s / rtx3080), and now I am stuck on upgrade options.

In iGPU mode (i5 12600k + UHD 770) everything is OK. Problems start only with graphics cards in the PCIe slot.

I replaced 2 monitors and 3 graphics cards, and finally built 2 PCs, one z390-based and one z690-based, with the same win10 build and graphics card driver config.

I also tried the old 500 W power supply that was used with the z390d MB; nothing changed.

Please write down every detail, they are important! The motherboard BIOS version can be seen in CPU-Z, the graphics card BIOS in GPU-Z.

Here is my safe workstation:

CPU: i5 9600kf with motherboard's option "CPU upgrade" = Gaming (+1 max turbo ratio)

MB: Gigabyte z390d ( F2 bios )

RAM: DDR4 CMK16GX4M2B3200C16 (4x8gb 2133mhz CL11 based -> 3200mhz Cl16 using XMP)

Graphics: Asus RTX 2070s ( 90.04.76.00.F2 bios, nvidia 466.63 driver )

Monitor: Benq bl2420z ( DisplayPort connection, auo m238hvn01.0 panel )

OS: win10 Enterprise LTSC x64 Build 17763.1098


update part #1: amd 780m dithering research


1. By default (driver_default state), AMD uses dithering at every color depth, according to the monitor specs (EDID)

Some evidence from the Linux kernel:

switch (bpc) {
case 6:
    if (dither == AMDGPU_FMT_DITHER_ENABLE) {
        /* XXX sort out optimal dither settings */
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_FRAME_RANDOM_ENABLE, 1);
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_HIGHPASS_RANDOM_ENABLE, 1);
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_SPATIAL_DITHER_EN, 1);
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_SPATIAL_DITHER_DEPTH, 0);
    } else {
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_TRUNCATE_EN, 1);
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_TRUNCATE_DEPTH, 0);
    }
    break;

case 8:
    if (dither == AMDGPU_FMT_DITHER_ENABLE) {
        /* XXX sort out optimal dither settings */
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_FRAME_RANDOM_ENABLE, 1);
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_HIGHPASS_RANDOM_ENABLE, 1);
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_RGB_RANDOM_ENABLE, 1);
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_SPATIAL_DITHER_EN, 1);
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_SPATIAL_DITHER_DEPTH, 1);
    } else {
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_TRUNCATE_EN, 1);
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_TRUNCATE_DEPTH, 1);
    }
    break;

case 10:
    if (dither == AMDGPU_FMT_DITHER_ENABLE) {
        /* XXX sort out optimal dither settings */
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_FRAME_RANDOM_ENABLE, 1);
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_HIGHPASS_RANDOM_ENABLE, 1);
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_RGB_RANDOM_ENABLE, 1);
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_SPATIAL_DITHER_EN, 1);
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_SPATIAL_DITHER_DEPTH, 2);
    } else {
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_TRUNCATE_EN, 1);
        tmp = REG_SET_FIELD(tmp, FMT_BIT_DEPTH_CONTROL, FMT_TRUNCATE_DEPTH, 2);
    }
    break;

2. Comparing results, I found that the 780m sets exactly the same register value in driver_default (0x0000C900) as in dith8_no_frame_rand (0x0000C900)

3. AMD uses the same dithering settings on vega6, vega7 and the 780m. Here is the list of possible values (a small decoding sketch follows the list):

0x0000C900 default
0x00002023 fm6
0x00002000 fm8
0x00002000 fm10
0x0000E123 dith6
0x0000E900 dith8
0x0000F100 dith10
0x0000C123 dith6_no_frame_rand
0x0000C900 dith8_no_frame_rand
0x0000D100 dith10_no_frame_rand
0x00002001 trun6 <<<
0x00002011 trun8
0x00002023 trun10
0x0000E111 trun8_dith6
0x0000E921 trun8_dith8
0x00002011 trun8_fm6
0x0000E900 dith8_fm6
0x0000E121 trun10_dith6
0x00002021 trun10_fm8
0x0000E921 trun10_dith8_fm6
0x0000F100 dith10_fm8
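
To make these hex values easier to read, here is a minimal C sketch that decodes them into the FMT_BIT_DEPTH_CONTROL bit fields from the kernel snippet above. The bit positions are my reading of the public DCE register headers (dce_*_sh_mask.h), so treat the exact masks as an assumption, not official documentation:

#include <stdio.h>

/* Assumed FMT_BIT_DEPTH_CONTROL layout (from the DCE sh_mask headers):
 * bit 0  FMT_TRUNCATE_EN             bits 4..5   FMT_TRUNCATE_DEPTH
 * bit 8  FMT_SPATIAL_DITHER_EN       bits 11..12 FMT_SPATIAL_DITHER_DEPTH
 * bit 13 FMT_FRAME_RANDOM_ENABLE     bit 14      FMT_RGB_RANDOM_ENABLE
 * bit 15 FMT_HIGHPASS_RANDOM_ENABLE
 */
static void decode(unsigned int v)
{
    printf("0x%08X:", v);
    if (v & (1u << 0))  printf(" TRUNCATE_EN(depth=%u)", (v >> 4) & 0x3);
    if (v & (1u << 8))  printf(" SPATIAL_DITHER_EN(depth=%u)", (v >> 11) & 0x3);
    if (v & (1u << 13)) printf(" FRAME_RANDOM");
    if (v & (1u << 14)) printf(" RGB_RANDOM");
    if (v & (1u << 15)) printf(" HIGHPASS_RANDOM");
    printf("\n");
}

int main(void)
{
    decode(0x0000C900); /* driver_default / dith8_no_frame_rand */
    decode(0x0000E900); /* dith8 */
    decode(0x00000011); /* truncate to 8 bit, no dithering (see point 4 below) */
    return 0;
}

With this reading, driver_default (0x0000C900) decodes to 8-bit spatial dithering with RGB and high-pass randomization but without frame randomization, which is why it matches the dith8_no_frame_rand entry above.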

4. Kawamoto's Ditherig does not set the values to 0x0 when you press "Disable all dithering" in his app

For my personal taste, setting all values to 0 does not work at all. The best setting for my eyes on my 6-bit + FRC monitor that I have found so far is 0x00000011. To apply the same setting, you need to:

A) Download ditherig

B) Open database.csv in the amd64 folder and, in lines 89..94, replace

0x00073900,0x00000000

with (8-bit truncate only)

0x0000C911,0x00000011

or (no settings at all)

0x0000C911,0x00000000

5. As many people here have noticed, some apps reset the dithering settings when used - yes, it is true. Here are the apps I tested for resetting the AMD driver settings:

Reset driver settings to default: Chrome, MS Office, OBS studio, Zoom Workspace

Do not reset driver settings: DXO Photolab, Firefox, Telegram Desktop, WinRAR, Win10 notepad / calc / paint /photoviewer, Visual Studio 2022, MPC-BE 1.8.0

6. Without an AMD driver installed, win10 with the Microsoft Basic Display Driver also shows 0x00000000 (no dithering applied). When you install any AMD driver, the dithering setting changes to 0x0000C900 (driver_default). I feel strain even without the AMD driver. That means the strain issue is hardware-based and cannot be controlled via driver (or register) settings

    https://filetransfer.io/data-package/GaHolmgv

    Here is a comparison of the pixel movement. The 60 fps recording lets you reduce playback speed to 12.5% in your player to find the repeating frame patterns I synchronized; it is better to navigate frame-by-frame (Ctrl + Right in the MPC-BE player) to catch this pattern:

    1. Vertical
    2. Horizontal
    3. Vertical
    4. Smooth

    Given the monitor's 60 Hz refresh rate, this is a 15 Hz, 4-frame cycle, and there is no difference between the z390 and z690 motherboards (same driver version, same settings, same win10 1809 build, nvidia dithering disabled via ColorControl, on the z690 via Ditherig)

    But the z690 gives instant eye-strain even in the BIOS; whether it is the 2022 or the 2024 BIOS version doesn't matter

    Does anyone know how to reduce image sharpness at the motherboard level?

    The issue could be extra-sharp rendering, which makes the monitor's pixel inversion/dithering very visible, and then the eyes get tired

    Sorry I don't really know. I'm just still fascinated that anything OTHER than the monitor could be the culprit.

      MrLoco OTHER than the monitor could be the culprit.

      Mr. SunnyCove from 4pda also mentioned there could be a case where the motherboard sets dithering in the monitor

      For example, my monitor doesn't support 10 bit, but the MB sends unsupported control signals for it, or sends 10-bit data, which causes "sharp" and bad monitor behavior

      I got the same eye-strain when I plugged in a 2021 monitor, a xiaomi mi 27 2k... slightly reduced feelings, but the same pressure. Today I am taking a day off from testing the bad z690 because I woke up with a headache

      7 days later

      Can new (2020+) motherboards do custom-area screen refresh?

      I noticed that the problems appear while reading (static content). And if you launch the Calculator app and drag it around the entire screen, the pressure on the eyes stops for about 3 seconds. So gamers may not detect the problem at all

      Are there utilities like screensavers that can emulate a small action across the entire screen without interfering with reading?

      My z690 experience:

      1. Turn off all ASPM in BIOS ( some of them enabled by default )
      2. Turn off all SpeedStep and SpeedShift in BIOS
      3. Disable Above 4G Decoding and ReBAR
      4. Old nvidia driver 466.67 - only DisplayDriver, all settings by default ( ColorRange = Full, 8 bit, RGB )
      5. Change ColorControl's dithering mode from Auto: Disabled to Disabled
      6. Set LowLatencyMode = Ultra and Power management mode to Prefer Maximum Performance in nvidia Control Panel
      7. Disable the monitor's 10-bit, 4:2:2 and 4:2:0 support and bt2020 support (keep only bt709) and delete all HDR mapping data in CRU
      8. Set the windows power plan to Maximum Performance

      All of this significantly reduces the eye-strain (to be honest, the pressure between the brows)

      These are the 8 settings I changed; let's discuss which are important and which are not. I also notice the problem more clearly when reading (static background) than when watching vids

      6 days later

      @"photon78s"

      All videos are 60 fps recordings of a 60 Hz monitor. You need to watch them frame-by-frame to see the FRC pattern. An interesting fact: in all specs the monitor is 8-bit, but according to the datasheet it sends only 6-bit data into the panel 🙂

      1. 6-bit data input: you can see only pixel inversion: https://filetransfer.io/data-package/eqe6MFii
      2. 8-bit data input: you can see 4 repeating patterns ( oldschool traditional FRC type, modern A-FRC is very different ): https://filetransfer.io/data-package/9aej86qS
      3. 8-bit data + spatial-temporal dithering via Ditherig: https://filetransfer.io/data-package/l39tphOQ

      It's hard to see the FRC patterns in the 8-bit + dithering recording because the signal becomes very noisy. Imagine what happens to letters whose edge takes only 1 vertical pixel row: they get a chaotic, non-static left-right shift, which overloads the eye nerves. To solve this problem people set a lower resolution to get wider/bolder letters and keep static elements inside the letter strokes (while the edge elements of the letters still dither)

        On my safe gigabyte z390d motherboard I am afraid to update the BIOS to the next version, because you cannot roll back to the previous one after the update. That update introduces capsule BIOS and nvidia's re-bar function (the CPU can access all of the GPU's video memory). I am not sure the update would be safe for me; at least the z690 motherboard doesn't list it in the patch notes, so I think the z690 BIOS includes re-bar by default

        I gathered some stats here on ledstrain regarding users' issues with different laptops. If we exclude pure display problems (oled, amoled, temporal-PWM screen backlight tech), mass problems start with:

        1) Intel 11th gen CPUs and newer: 8th/9th gen is good (my own experience), not many complaints regarding 10th gen on lga1200, but a lot of bad reports regarding 11th gen lga1200 and 12th gen lga1700 (my own experience). Perhaps 10th gen is affected too

        2) AMD 5800h and later series CPUs: my mate's 5600h is good, the 4600h is also good (my own experience)

        6-bit + FRC = 8-bit: 4 repeating pattern frames to get the extra 2 bits of color data

        6-bit + Hi-FRC = 10-bit: 8 repeating pattern frames to get the extra 4 bits of color data

        At 60 Hz, Hi-FRC is very noticeable and the eyes get strained; that is why manufacturers raise the refresh rate to 120+ Hz to smooth the effect. The question is: manufacturers present 120 Hz as good for games and for the eyes, but can they really "mask" the 8 false frames?

        On a 120 Hz panel you get only 120/8 = 15 real frames, all the others are "flicker"

        On a 144 Hz panel you get 144/8 = 18 real frames (see the sketch below)
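
        Just to spell the arithmetic out, here is a tiny sketch (the helper name real_frames_per_second is only for illustration). The 4-frame and 8-frame cycle lengths are just the assumptions stated in the two lines above, not numbers from any datasheet:

        #include <stdio.h>

        /* Assumes the cycle lengths claimed above: 4 frames for classic FRC
         * (6-bit -> 8-bit) and 8 frames for Hi-FRC (6-bit -> 10-bit). */
        static double real_frames_per_second(double refresh_hz, int frc_cycle_frames)
        {
            return refresh_hz / frc_cycle_frames;
        }

        int main(void)
        {
            printf("60 Hz,  4-frame FRC:    %.1f real frames/s\n", real_frames_per_second(60, 4));
            printf("120 Hz, 8-frame Hi-FRC: %.1f real frames/s\n", real_frames_per_second(120, 8));
            printf("144 Hz, 8-frame Hi-FRC: %.1f real frames/s\n", real_frames_per_second(144, 8));
            return 0;
        }

        On this assumption a higher refresh rate does not remove the FRC cycle, it only runs the same cycle faster.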

        Here is a Hi-FRC monitor review:

        "I worked on a laptop for about five years and had no problems with my eyes at all. My vision has always been very good, a little sharper in the distance. And so I wanted to buy myself a larger screen to make working with the code more comfortable. My choice fell on the Lg 27Ul850-W monitor, since it has Type-C for my MacBook. At first, I didn’t feel any discomfort while working, fatigue in my eyes began to appear after two weeks, but since I didn’t experience any problems with my eyes at all, I didn’t understand what was happening at all and chalked it up to general fatigue.

        As a result, after two months of torment with settings, different calibrations, nothing changed, my eyes began to hurt wildly. What I have now: I can’t look at any screen at all, my eyes burn, my head hurts, etc. I can read the news for no more than five minutes. I went to the ophthalmologist and threw out a 10 for an examination. A diagnosis of accommodation was made and glasses were prescribed for unloading near vision; the glasses also have a blue filter. I tried to sit at this monitor with glasses on, the situation did not change, although the glasses really take off the load in close proximity and it became noticeably noticeable on the laptop. As a result, I switched back to my laptop, my eyes began to recover, and Irifrin was also prescribed for a month.

        I just can’t understand what’s wrong with this monitor, I turned down the brightness and contrast, I did a pencil test and didn’t notice the pwm, I also turned on slowmo at 250 fps on the iPhone, I also can’t see the pwm, in short, it’s some kind of fantasy. My wife thought that I just had sensitive eyes, she worked on it for a week and in the end her eyes also hurt. In general, I do not recommend this model. After all this, I was generally disappointed in monitors from LG."

          In the specifications they wrote an 8-bit panel, same as my benq, but the truth is it is 6-bit + FRC (4 patterns). In the datasheet summary you can see "panel can display 16.7M colors, 8-bit input", but in the diagram I posted above you can see that only 6-bit data is sent to the panel, while keeping the 8-bit input

          Does anyone know a model with a real 8-bit panel? In my experience you can only detect it with a microscope or a telephoto camera lens, recording at the same FPS as the panel refreshes

          When you look at modern panels at a low refresh rate, some vertical rows can be seen all across the screen. I think this is a vcom screen optimisation; in detail, it is rows of green vertical subpixels. When you switch the panel to 120+ Hz those rows are still there, but no longer visible to the eye. Have a look at the diagram from the datasheet:

          20 days later
          24 days later

          All monitors were in 8-bit 60 Hz mode with cleared EDID settings, on the z390 motherboard + rtx 3080 graphics card; the video is 4k60p

          !!! you need to slow playback down 8x to see the FRC difference !!!

          AOC Q27B3MA - 2k60hz

          https://cloud.mail.ru/public/g6vi/uG9gdWZ3r

          benq bl2420z - FHD 60hz ( AUO m238hvn01.0 )

          https://cloud.mail.ru/public/nDaD/yy71Zr9kK

          xiaomi mi 27 2k 165hz ( AUO M270DAN02.B )

          https://cloud.mail.ru/public/9w6Q/y7qSgnnvm

          philips 275V8LA - 2k60hz ( TPM270WQ1-SG1G012 )

          https://cloud.mail.ru/public/cqgm/xgmLS7Sgg

          =======================================================

          Upd:

          And here is how simple dithering looks ( z690 + iGPU + benq in 6-bit mode )

          https://cloud.mail.ru/public/anaU/kahSkroTa

          And here is how the classic 4-pattern FRC dithering looks (z690 + iGPU + benq in 8-bit mode)

          https://cloud.mail.ru/public/H3BJ/7KjVMiJcA

          =======================================================

          Upd2:

          3 video cards in z390d + benq 8-bit

          The gtx1060 has the most visible FRC patterns in terms of sharpness, the rtx2/3 gen have stronger dithering (hides the FRC patterns) and overall image "blurriness"

          https://cloud.mail.ru/public/dMBD/ZU1XN4T7e

          simplex

          6-bit + FRC = 8-bit: 4 repeating pattern frames to get the extra 2 bits of color data
          6-bit + Hi-FRC = 10-bit: 8 repeating pattern frames to get the extra 4 bits of color data

          Are you sure this is the case? As far as I've read up:
          6bit + FRC = 16.2m
          6bit + HiFRC = 16.7m


          As per http://www.lagom.nl/lcd-test/black.php at least.

          At 60 Hz, Hi-FRC is very noticeable and the eyes get strained; that is why manufacturers raise the refresh rate to 120+ Hz to smooth the effect. The question is: manufacturers present 120 Hz as good for games and for the eyes, but can they really "mask" the 8 false frames?
          On a 120 Hz panel you get only 120/8 = 15 real frames, all the others are "flicker"
          On a 144 Hz panel you get 144/8 = 18 real frames

          Can you quote the source? I'm confused what you mean by this.

            qb74 Are you sure this is the case? As far as I've read up:

            Yes

            Moreover, I suppose most 8/10-bit monitors are 6-bit + Hi-FRC; the bad part is that my camera is only 60p, so I can't check the monitor's FRC in 120 Hz mode to find the 8 "blended" frames. On my benq I detected 4 FRC pattern frames; on the xiaomi it's hard to detect FRC patterns because it uses A-FRC (not classical FRC patterns)

            I was not able to spot any flicker happening on the videos you've recorded.
            It's a bit hard to capture this without an oscilloscope + probe, cameras sadly do not suffice for this.

              qb74 I was not able to spot any flicker happening on the videos you've recorded.

              I'm talking about this video -> https://cloud.mail.ru/public/UhWz/kYGUfMPvr

              You need to download the video, then play it frame by frame. In the bottom (8-bit labeled) part of the video you will see the 4 repeating FRC patterns (vertical, horizontal, vertical, smooth) if you look at the taskbar, just to the left of the "up arrow" symbol.

              In 6-bit there are no patterns, only the monitor's random dithering noise is visible on the taskbar.

              In the 8-bit case, on top of the dithering noise layer you saw in the 6-bit part, the monitor adds an FRC layer with 4 patterns (through the FRC patterns you can notice the same 6-bit dithering noise structure)


                simplex There is no universal safe hardware list.

                Different people are impacted by different things and identical models of hardware can have different components.


                  simplex
                  Mate, do you not understand that my panel can dither and we cannot have the same experience as what you view IRL?
                  Oscilloscope + (fast) light probe is the objective way to measure this, not some random videos.

                  ensete
                  There is a "safe" guideline for monitors, I'd say.
                  But people here also need to start evaluating biological factors instead of blaming everything on the monitor.

                  A safe monitor consists of:

                  • Having the least amount of flicker possible (easily checked with a fast oscilloscope + light probe)
                  • Highest refresh rate possible (sub-360hz is too blurry)
                  • Highest resolution possible (if possible, opt for high refresh laptop panels due to PPI)
                  • Does not use KSF phosphor / QDEF backlight (both seem to have reports of causing issues for people), instead employs regular LED, CCFL backlight or WOLED (seems to be fine for some people, only caveat is brightness dip at refresh rate cycle)
                  • Avoiding high brightness (100-150 nits is fine for daytime use, personal preference territory)
                  • Using a warmer whitepoint (D55 / 5500k is nicer than 6500k or colder, but this is personal preference)

                  There is a gray area however, which is very individual + not a lot of info on this.

                  • Matte/Glossy considerations (some work for ones, not for others)
                  • Polarizer orientation (some orientations work better for ones, not for others)
                  • Dithering technique used (temporal, simple temporal or static, I've seen reports of static being horrendous for some)

                  People here use extremely archaic & ridiculous panels imo.

                  There are outside factors such as:

                  • Light in room (bulbs are notorious for flicker)
                  • Biological/psychological factors (vitamin/mineral deficiency, stress levels, hormonal imbalances, diet etc.)
                  • EMFs (yes, they play a major role in health)

                    qb74 not some random videos.

                    The main issue of modern monitors is not dithering or FRC; the issue is the vcom adjustment that prevents pixel sticking

                    Looking at my vids you will see a different vcom in each monitor. You don't need a microscope to record vcom: a 60 Hz monitor, a 60p camera and good zoom are enough (the main rule: the camera should give you a high-res picture and record more FPS than the monitor shows; better yet, sync the recording speed)

                    There are many influencing factors that can be listed, but to check each of them, you need to take measurements.

                    I believe that monitor problems can be reduced by following these rules:

                    1) No more than 60/75Hz (I saw a hard vcom in all high-frame rate monitors)

                    2) Without HDR and excessive brightness (<= 250 nits)

                    3) Year of manufacture 2014 ... 2020

                    4) IPS or VA panel does not matter

                    5) WLED backlight without quantum dots or other extended-gamut technologies, giving sRGB coverage of about 90...100%

                    6) Monitor coating - very light matte (hard-matte blurs the structure of RGB subpixels and strains the eyes)

                    7) Almost all monitors are now without low-frequency PWM (> 10 kHz)

                    8) Almost all monitors have a blue peak at 446 ... 460 nm, this is safe

                    9) Basic color temperature around 6500k; when looking at the monitor there should be no cold shades

                    10) DP or HDMI port - does not matter. Both can dither, depending on the EDID settings

                    photon78s Interesting discussion here. This link just for learning about VCOM:

                    You mean these patterns are not vcom? Okay, it's better to call it a "vcom mechanism", or the interlace pattern artifacts used in pcmonitors reviews

                    Today I measured an aoc 24b1h, a dexp df24n3 and a Xiaomi Mi Desktop Monitor 1C. All of them have 2-step pixel twinkling (left - right).

                    The Acer 240y doesn't have this; its vcom is similar to my benq (4-step pixel twinkling, at half the frequency compared to new monitors). I got the Acer for testing

                      simplex

                      Nope. No assumptions of anything. Just trying to figure this concept out. At some point maybe you could add vcom to this forum's wiki or the glossary section. Btw, is your username "simplex" referring to simplex noise?

                        simplex is from the word simple - simpler, a simplification of the complex

                        OK, I will add it after I've done some tests. I found 2 main vcom patterns - horizontal (left to right) and a chess pattern

                          simplex

                          Nice. I see some usernames alluding to engineering terms or concepts from computer programming. Maybe some "monitor engineers" are lurking here in the shadows…

                            photon78s

                            Here is some results I got:

                            When I set my 165hz xiaomi panel down to 45hz (30…45), there are no green vcom columns anymore; you can use the monitor and feel only the graphics card + windows dithering, with no "dithering" added by the monitor

                            https://cloud.mail.ru/public/frSj/jFbYfBr1w

                            When I set it to 46hz or more (50, 60, 100, 120, 165), vcom adds its bars and the eye nerve gets strained after 20+ minutes

                            https://cloud.mail.ru/public/WjBP/ga9DxqJGa

                            p.s. all vids are slowed down 4 times. A static image is being recorded

                            p.p.s. all modern monitors I tested have the same vertical bars or another pixel-sticking protection. I think pair-based vcom (first frame: the left green pixel is lit and the right is off; second frame: the left is off and the right is lit) overloads the eye nerve. My old safe benq doesn't have pair vcom, it has a 4-step one

                            In the engineering menu I found no options for vcom adjustment. I think it is controlled by an extra chip on the panel board, or it is part of the t-con config

                              simplex

                              Will this vcom still be an issue even if you remove the backlight such as when doing a conversion of a regular display panel to a solar backlit display?

                                photon78s

                                I think yes. The backlight is typical - a big blue peak similar to my benq, but the red peak is KSF-based. Without vcom you can relax more than with vcom (when setting a 120hz+ refresh rate)

                                a month later

                                My new findings:

                                1. If you have a severe attack (you feel eye-strain even after using safe displays for a week), Glycerin and Ethylmethylhydroxypyridine can help reduce eye pressure. Consult a neurologist first
                                2. There can be a "tail" condition (your nerve stays tense even when the irritant is gone)
                                3. The GPU writes some records into the motherboard's NVRAM, so it's better to clear it (remove the motherboard's CMOS battery for 30 min) after using a bad GPU
                                4. To completely remove nvidia driver configs, use DDU (sometimes you replace the GPU but the eye-strain still persists because of the "previous card rendering configuration" stored in windows)
                                5. Software also matters. I stick with win10 1809 17763.1098 with updates blocked; my current nvidia driver is 536.23 (check "perform a clean installation" during install, and afterwards check that you are using 8 bpc / RGB / full color range in the nvidia control panel, which disables dithering)
                                6. Using CRU you can delete the monitor's color data to let windows use the "default" colorspace without transformation (the same effect people get using HDMI-DVI or HDMI-DP cable adapters, or switches)
                                7. Using monitor profiles or windows night mode (yellow tint shift) also activates a color transformation
                                8. If the steps above don't help, you need to replace hardware and test again. My most comfortable working setup is z390 (lga1151v2) + rtx2070s (TU104-based) + benq (a month ago I bought a new benq 2420z and a used z390 + 9600kf = everything is great on the 2nd PC)

                                  qb74 There is a "safe" guideline for monitors, I'd say.

                                  Not really. Just going off your guide: flicker has zero impact on me whatsoever, and refresh rate ABOVE 60hz causes me symptoms, higher resolutions are worse for me, and I need to be sub 3000k color temp, not 5000k

                                  I know there is a strong desire to find a "silver bullet" that cures us, but it doesn't exist. Each person needs to find a specific setup that works for them and then never change it.


                                    ensete

                                    flicker has zero impact on me whatsoever,

                                    Flicker impacts every single human being, whether they perceive it or not. All studies I've come across point to this conclusion.

                                    and refresh rate ABOVE 60hz causes me symptoms,

                                    You might've chosen monitors which have a modern wide color gamut backlight (KSF or QDEF); every monitor is a story of its own.

                                    higher resolutions are worse for me,

                                    Same story as above, depends on panels which you've used.

                                    and I need to be sub 3000k color temp, not 5000k

                                    Yes, ideally no blue light is best but not everyone can afford using a red screen all the time.

                                    but it doesn't exist.

                                    It does, people here just seem to like going on wild goose chases.

                                    I wonder how KSF phosphor interacts with the chromostereopsis effect. It probably doesn't help the eyes focus.

                                    simplex Speaking of which, running the windows update services significantly affects eye pain, unexpectedly ). Even if you installed a clean Windows but left the update services on, then on average after a day "hidden updates" will arrive, which will not even appear in the list of updates, and you will feel pain in the eyes; I experienced this myself. Just some information for "club" members ).
                                    Then, here is how I deal with a "previous card rendering configuration" or a hidden Windows update. Step by step:
                                    1) I uninstall the driver in the normal way for Windows.
                                    2) I reboot into safe mode and uninstall via DDU.
                                    3) I go back to normal mode and check that the Windows update services and tasks are stopped.
                                    4) I sequentially run the following commands:
                                    DISM.exe /Online /Cleanup-image /Restorehealth
                                    sfc /scannow
                                    5) I install a different driver than the one I had.
                                    6) Reboot
                                    7) Repeat steps 1, 2, 3, 4.
                                    8) Install the driver you need (in my case, the one I had).
                                    9) Reboot


                                      23 days later

                                      After 1 week without any screens except a smartphone, my eyes got rested

                                      When I sit at my z390 + rtx2070s, the pain between the brows begins after 2 hours. I switched the GPU to a gtx1060, cleared the CMOS by removing the battery, and the pain stopped

                                      So it was about 1.5 years after switching from the gtx1060 to the rtx2070s that I started to feel the strain. The same timing for my wife, but with an rtx3080ti card

                                      Looking at GPU specs, I found that the gtx1060 and 1660s (which is safe according to this forum) are directX 12.1, while the rtx20 and newer generations are directX 12.2 ultimate. Intel UHD630, UHD770, IRIS XE and Intel Arc iGPUs are also 12.1, while amd 610m and newer are 12.2

                                      My question: who uses a dx12.2-generation graphics card without any issues? Note: dx12.2 support was introduced in win10 2004 and newer. In my theory, win10 < 2004 plus a dx12.1 graphics card is the safest combo

                                      dev