glvn

I have added CRU screenshots of the emptied data blocks; feel free to ask for any details!

I also tested the miniPC (13700h + Iris Xe 96EU) for a whole week with the same BenQ HDMI-to-DVI-D cable; it seems okay, similar to the z390d + gtx1060:

  1. win10pro 21H2 19044.1889, Windows Feature Experience Pack 120.2212.4180.0, updates stopped
  2. Intel Graphics driver 31.0.101.5522 (default settings)

I plan to test win10pro 19045.4780 (updated up to August '24) on the miniPC next week

4 days later

simplex

I switched the GPU to the gtx1060, cleared CMOS by removing the battery, and the pain stopped

After switching: besides the feeling of pain, do you see any visual difference in the image, banding for example?

My 9600KF has CPUID 906ECh and 0xAE microcode,

even earlier you wrote:
"the build number is very important: 1809 LTSC 17763.1098, the next 2 processor updates (kb4589208 and kb5035849) will bring back your eye strain"
https://ledstrain.org/d/2712-windows-with-colorcontrol-still-makes-me-eye-strain/13

BIOS F2 contains microcode 0xAE for 906EC, but Windows can update the microcode
(in particular, kb4589208 updates the microcode to 0xCA for 906EC)

My guess: if merely installing kb4589208 adds eye strain, perhaps the microcode version is ONE of the factors??
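
To verify which microcode is actually active on a given install, here is a minimal Python sketch (assuming the usual Win10 layout, where the loaded and BIOS-provided revisions sit under CentralProcessor\0 as 8-byte binary values):

import winreg

KEY = r"HARDWARE\DESCRIPTION\System\CentralProcessor\0"

def revision(name):
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as k:
        raw, _ = winreg.QueryValueEx(k, name)   # REG_BINARY, 8 bytes
    # the revision usually sits in the upper DWORD; keep the raw hex as a fallback
    return int.from_bytes(raw[4:8], "little"), raw.hex()

for name in ("Update Revision", "Previous Update Revision"):
    rev, raw = revision(name)
    print(f"{name}: 0x{rev:X} (raw {raw})")

Comparing "Update Revision" (what Windows loaded) against "Previous Update Revision" (what the BIOS shipped) should show whether a KB really swapped 0xAE for 0xCA.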

After that I found my BenQ is very calm over the DVI connection, and after checking in CRU I found that all the extension blocks are empty, so with a DP cable I reproduced the DVI condition to get the same calm monitor behavior
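
To double-check that the stripped EDID is really what Windows sees, a rough Python sketch like this walks the monitor EDIDs cached in the registry and prints the extension-block count from byte 126 of the base block (note: this reads the EDID Windows cached from the monitor itself; where CRU stores its override may differ, so treat it only as a sanity check):

import winreg

ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ROOT) as display:
    for i in range(winreg.QueryInfoKey(display)[0]):
        model = winreg.EnumKey(display, i)
        with winreg.OpenKey(display, model) as model_key:
            for j in range(winreg.QueryInfoKey(model_key)[0]):
                inst = winreg.EnumKey(model_key, j)
                try:
                    with winreg.OpenKey(model_key, inst + r"\Device Parameters") as p:
                        edid, _ = winreg.QueryValueEx(p, "EDID")
                except OSError:
                    continue
                # byte 126 of the 128-byte base block = number of extension blocks
                print(f"{model}\\{inst}: {edid[126]} extension block(s)")

A DVI-style "bare" EDID reports 0 here, while a typical DP/HDMI EDID carries at least one CTA-861 extension block.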

If I understood you correctly, the same monitor connected via DP causes less eye strain with the EDID modification than the same monitor/DP but without the EDID modification?

100% safe graphics card with the Giga z390d (F2 BIOS of 2019.10.15): white Asus gtx1060 Dual, GPU BIOS 86.06.0E.00.41, connection via an HDMI-to-DVI cable

HDMI on the graphics card side?
In the nvidia control panel, is such a connection displayed as DVI or HDMI?
I'm trying to understand: does eye strain depend on the type of monitor interface (in a potentially safe combination: z390/1060/17763.1098)?

I found a gtx1060 and a 1660s (which is safe according to this forum)

Unfortunately, not all 1660s are safe (

https://ledstrain.org/d/1524-eye-strain-with-new-gpu/79
"Before that i use gtx 1660s inno3d twinx2 gddr6 SAMSUNG and this one is the best. No eyestrain at all.
Then i use gtx 1660s inno3d twinx2 gddr6 micron and hyenix and make me feel eyestrain."

There was also a discussion on this forum about the safe variant of the 970; the general conclusion that I remember is that only one specific variant is safe (the Gigabyte G1 Gaming 970, version 1.0), not all Gigabyte 970s (

I also tested the miniPC (13700h + Iris Xe 96EU) for a whole week with the same BenQ HDMI-to-DVI-D cable; it seems okay, similar to the z390d + gtx1060

i5 12600k + UHD 770 - all is OK

is there a difference in eye strain between these configurations?

You know, I exported all registry settings to find the difference after changing the 2070s to the 1060, and found nothing
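
For what it's worth, a quick way to repeat that comparison is to diff the two full exports programmatically; a rough sketch (the file names are placeholders, and regedit saves .reg exports as UTF-16):

def load(path):
    with open(path, encoding="utf-16") as f:
        return {line.strip() for line in f if line.strip()}

before = load("export_2070s.reg")   # export taken with the 2070s installed
after = load("export_1060.reg")     # export taken with the 1060 installed

for line in sorted(before - after):
    print("only with 2070s:", line)
for line in sorted(after - before):
    print("only with 1060: ", line)

It only catches added or removed lines, not values that merely moved between keys, but it is enough to spot driver-created entries.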

And have you experimented with DitherRegistryKey in the registry?

    glvn visual difference in the image

    Can't say regarding visuals between the rtx20 cards and the gtx1060, but:

    1. z390d + gtx1060 and 13700h + Iris Xe look the same in terms of banding on the same monitor
    2. rtx3080 or 3080ti (already sold) looks brighter, more "smooth" or maybe noise-reduced? But that was on the z690d chipset (z690d + UHD770 iGPU looks as I wrote above)
    3. I also recorded z690/z390 and iGPU/2070s with the same monitor on camera (4k60p) - no visual difference at all, the same "pixel walking", which is the monitor's FRC

    glvn perhaps the MicroCode version

    Maybe. Having tested win10 21H2 and the latest win10 22H2 without any discomfort on the 13700h + Iris Xe, I plan to test the same Windows build on the z390d + gtx1060

    glvn less eye strain with the EDID modification than the same monitor/DP but without

    True, tested with the Xiaomi and BenQ monitors

    glvn HDMI on the graphics card side?

    Yes, the HDMI end is on the graphics card side, the DVI-D end is the monitor's (it also accepts DP; there are no HDMI ports on the monitor)

    glvn In the nvidia control panel, is such a connection displayed as DVI or HDMI?

    It shows as a 32-bit DVI connection - no options to choose bit depth or the 4:2:2 sampling that DP brings (HDMI limitations, I suppose)

    glvn is there a difference in eye strain between these configurations?

    My wife says the z690d + 12600k is strain-free, but comparing the z390 + 1060 (win10 1809) vs the 13700h + Iris (win10 22h2), I think they are both comfortable

    glvn experimented with DitherRegistryKey in registry ?

    Not only that, but yes. Changing only those keys changes nothing; the ColorControl app activates more Windows options, and after that I can change the values you wrote and get an instant result. It's simpler to use ColorControl for that
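
    If anyone wants to see what ColorControl actually wrote, a hedged Python sketch like this dumps any Dither* values under the display-adapter class keys; the exact location of DitherRegistryKey is an assumption on my part, and the adapter index (0000, 0001, ...) depends on the system:

    import winreg

    # display-adapter device class; adapter subkeys are 0000, 0001, ...
    CLASS = r"SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}"

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CLASS) as cls:
        for i in range(winreg.QueryInfoKey(cls)[0]):
            sub = winreg.EnumKey(cls, i)
            try:
                with winreg.OpenKey(cls, sub) as adapter:
                    for j in range(winreg.QueryInfoKey(adapter)[1]):
                        name, value, _ = winreg.EnumValue(adapter, j)
                        if name.lower().startswith("dither"):
                            print(sub, name, value)
            except OSError:
                continue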

    I also experimented with some differing nvidia values, also without success; "DmaRemappingCompatible"=dword:00000003 was added in 472.12 for the 2070s (not present in 466.63)

    The main registry difference between the 1060 and the 2070s:

    The gtx1060 has:

    [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\nvlddmkm\Global\NvHybrid\Persistence\ACE\HDR]
    "BrightEv"=dword:00000000

    [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\nvlddmkm\Global\Startup\StartS3SR]
    @=dword:00000001

    "SaturationRegistryKey"=dword:00000032

    The 2070s has:

    [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\nvlddmkm\Global\NvHybrid\Persistence\ACE\PFF\256]
    "CurveType"=hex:01,00,00,00,01,00,00,00,01,00,00,00
    "FrequencyHz"=hex:80,0e,80,69,80,0e,80,69,40,5b,aa,5f
    "Temperature"=hex:00,53,00,00,00,55,00,00,00,57,00,00
    "ThermalLimit"=hex:00,00,a6,42

    [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\nvlddmkm\Parameters]
    "DmaRemappingCompatible"=dword:00000003

    The DirectX feature-level values in the registry (12.2 or 12.1, for example) also don't matter - the rtx20 has the same values but gives pain

    glvn

    100% safe graphics card with the Giga z390d (F2 BIOS of 2019.10.15): white Asus gtx1060 Dual, GPU BIOS 86.06.0E.00.41, connection via an HDMI-to-DVI cable

    I forgot to ask: which memory manufacturer's chips are on your 1060? (Samsung, Hynix, Micron ...)

      glvn

      gtx1060 - GDDR5 Samsung

      rtx2070s - GDDR6 Micron

      rtx2080s - GDDR6 Samsung

      rtx3080, 3080ti - GDDR6X Micron

      9 days later

      I changed the RAM in the safe mini-PC (13700h + Iris Xe 96EU) and got eye strain

      The 13700h processor supports DDR5 up to 5200 MT/s

      KVR48S40BD8K2-64 was installed (2x32GB, 4800 CL40, possibly Micron chips) - no issues

      I installed KF556S40IB-16 (2x16GB, 5600 CL40, Hynix chips), which ran at 5200 CL38 - discomfort

        simplex
        I have a weird theory which you could try.
        Could you try running RAM testing software, such as Karhu, MemTest Pro, or TM5, and see if you're stable in them for +16h?
        Or, if possible, use the same RAM but lower the frequency to 4800 MHz.
        You might be, as the overclocking community puts it, "unstable", and that might influence your smoothness perception during use.

          qb74 +16h?

          too much, dude %)

          The mini-PC can't edit timings, but my safe PC (z390d + gtx1060) can

          I inserted the rtx2070s (which gave me strain after 2 hours about a month ago, after which I switched to the gtx1060 and got calm)

          First of all, I tested whether CPU overclocking matters, setting default (3.7GHz), then the Gaming profile (4.4GHz), then the Advanced profile (4.5GHz) - nothing changed

          Then I tested how the DDR4 memory works in my z390 by default - at 2133MHz (15-15-15-36). I can't say it is paper-like, but it is 100% not as straining as the XMP profile (3200 CL16-16-18-36) was

          Then I set 3000MHz with the default timings calculated by the motherboard (21-22-22-50) - it was the same as 2133

          Then I set 20-18-18-36, keeping the same 3000MHz - a bit worse

          Then 16-18-18-36 @ 3000 - definitely bad

          Okay, I set 3200MHz as the max supported memory frequency - it was also good with 22-23-23-53

          Then 20-18-18-36 - not as good

          Then 18-18-18-36 - bad

          Finally, I am continuing to test my win10 with the auto timings from the motherboard (3200 22-23-23-53)

          Here are the bad timings causing eye strain with the 2070s:

          Here are the good ones, calculated by the motherboard (not paper-like but... okay? still testing)

          Imagine if memory can impact

            simplex

            Imagine if memory can impact

            RAM can impact systems in ways most don't comprehend.
            It can cause microstutters, which you could easily be experiencing, thus leading to your eyestrain.
            Think of it like this (in a rudimentary way):
            XOC (term for extreme overclock - used to indicate unstable RAM behavior) => microstutter => need to refocus more often => eyestrain
            (this is just a theory for the eyestrain part)

            Your testing seems to somewhat confirm this even further.
            Keep in mind, the system can get corrupted if your RAM is unstable, which could (potentially) further increase eyestrain.

            RAM at XMP can be unstable if paired with a mediocre motherboard and CPU IMC. You're only relatively "safe" at CPU-vendor-specified speeds (which gets very hazy with DDR5-era CPUs).

            Would be funny if people here have eyestrain due to unstable RAM.

            EDIT: Would it be possible for you to use ASROCK Timing Configurator or MemTweakIt to showcase your entire timing list? (primary, secondary, tertiary)

            Link for first utility: https://download.asrock.com/Utility/Formula/TimingConfigurator(v4.1.0).zip

              Ok I will look at them

              the facts:

              1. DDR4 3200@16-18-18-36 + 1060 (GDDR5) = good
              2. DDR4 3200@16-18-18-36 + 2070s (GDDR6) = you have to be patient
              3. DDR4 3200@22-23-23-53 + 2070s (GDDR6) = not paper-like but doesn't require patience
              4. DDR4 3200@16-18-18-36 + 3080 (GDDR6X) = bad
              5. z690 DDR4 3200@16-18-18-36 + 3080 (GDDR6X) = bad

              win10 shares system RAM with the GPU. Perhaps a large mismatch between GPU memory and system RAM speeds/timings causes strain.

              1. How can I determine the GPU memory timings, so I can tune the RAM closer to them and check the theory?
              2. How can I disable the GPU's ability to eat/share system RAM?

                qb74 MemTweakIt

                Current (safe?) timings as determined by the motherboard

                simplex
                Would you be able to list your whole PC hardware setup (peripherals included) and software stack (GPU driver, Windows version)? I'm blind, you've done so in the OP.
                Which power supply are you using? Are you still using the hardware you specified in the OP? Which peripherals are you using?
                When was the last time you reinstalled your OS cleanly?

                GPU DRAM timings have long since been locked down on the Nvidia side.
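
                If you want to see how much shared system RAM the GPU actually commits (your question 2), here is a rough sketch that samples the Windows GPU performance counters; the "GPU Adapter Memory" counter set and its counter names are an assumption about recent Win10 builds, so verify they exist on yours:

                import subprocess

                counters = [
                    r"\GPU Adapter Memory(*)\Dedicated Usage",
                    r"\GPU Adapter Memory(*)\Shared Usage",
                ]
                # -sc 1 = take a single sample; run it while the strain-inducing workload is on screen
                out = subprocess.run(["typeperf", *counters, "-sc", "1"],
                                     capture_output=True, text=True)
                print(out.stdout)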

                  qb74 Which power supply are you using?

                  Before the z690 was sold, I tried another 750W PSU, then my 500W PSU which currently runs the z390d - nothing changed

                  The OS was installed in Feb 2024 with updates blocked; only 2x SATA SSDs + the GPU + 1x M.2 SSD are connected

                  moonpie which can kick in VRR more often

                  In my BIOS (2019), OS (May 2020), and drivers (mid-2021) there is no VRR

                  It would be a joke if DDR5@4800 didn't have such things. The last laptop I tested was a Huawei D16 2024 with a 13900h and Iris Xe (same as in the mini-PC), but it caused me strain after 20 min. That laptop also used OC RAM - LPDDR4X @ 4266MHz

                  moonpie Wow, I didn't know this. I remember someone a long time ago here said that running a system in 32-bit mode or with single-channel RAM would help with strain, which he thought was due to certain DirectX features not being enabled under those conditions.

                  moonpie That's good to know. I'm assuming that applies to Arc GPUs too? I'll share this with someone I know who has issues with Xe.

                  a month later

                  Guys, need advice.

                  My current safe Asus laptop has an lp173wf4-spf3 panel which I use at 80% brightness. After 11k hours of usage it lost red and took on a blue-green tint, so I finally found what should be the same panel, new - the lp173wf4-spf5. But the first time I swapped it in, I felt... eye strain (not headache)

                  I measured it with the Opple 4 and found that at 61% brightness it has the same modulation depth (%) and the same brightness readings that I get at 80% on the spf3

                  spf3 at 80%

                  spf5 at 61%

                  spf3 at 90%

                  spf5 at 90%

                  Even though the spf5 is supposed to have DC dimming, the triangles show it has ~1000Hz smoothed PWM (4 triangles in a 4 ms window)

                  By the same logic, the spf3 has ~200Hz PWM
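
                  (The arithmetic behind the triangle counting, for anyone checking my numbers - frequency is just cycles divided by the time window:)

                  triangles = 4          # ripple cycles counted on the Opple trace
                  window_s = 0.004       # 4 ms capture window
                  print(triangles / window_s, "Hz")   # -> 1000.0 Hz for the spf5
                  # e.g. 4 cycles over a ~20 ms window would give the ~200 Hz estimated for the spf3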

                  The question is: is it possible to adapt to the new panel? If I wait a few more days, I could lose the return window.

                  Every time I tried to use the SPF5 I felt a slight burning in my eyes; the brightness level doesn't matter. When I put the SPF3 back, everything became good instantly. Is it my nervous system's adaptation to the old panel that I have been using since 2018?

                  p.s. The panels use the same EDID. The datasheets say they are 99% identical (same PWM frequency limits and duty ratios, contrast, colors, etc.)

                  dev