qb74 do you have an example of this fast oscilloscope with light probe? And/or guides on how to make it and how to test the flickering with it?


    qb74 it's very interesting! Don't you find the Opple Light Master device acceptable for those measurements? It's cheaper and can detect high-frequency light changes

      4 days later

      Ivan_P
      Until I see a disassembly of it, along with a spec sheet (since it's basically emulating an oscilloscope + light probe setup: characteristics such as the bandwidth of the probe and of the device itself, the shape of the frequency response, etc.), I don't consider it a valid option for comparisons.

      Ideally, one would disassemble the panel and probe the backlight itself for brightness fluctuations, but that is not possible for most people.

      Ivan_P Opple and Radex Lupin are both good enough.

      Any cheap handheld scope + alligator clips + a $0.10 LED (responsive down to the ns range) used as a light sensor will work. You can splurge on a dedicated photodetector for $1.50-$25. Mouser sells everything.
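
      If it helps, here's a minimal sketch of pulling the flicker frequency and modulation depth out of such a capture, assuming the scope exports a two-column time/voltage CSV (the filename and column layout are assumptions; adjust for your scope):

      import numpy as np

      # Scope export: column 0 = time (s), column 1 = photodiode voltage (V).
      # "capture.csv" and the column layout are assumptions about your scope's export.
      t, v = np.loadtxt("capture.csv", delimiter=",", unpack=True)

      dt = np.mean(np.diff(t))                      # sample interval
      spectrum = np.abs(np.fft.rfft(v - v.mean()))  # drop DC, keep the AC content
      freqs = np.fft.rfftfreq(len(v), dt)

      peak = freqs[np.argmax(spectrum)]             # dominant flicker frequency
      depth = 100 * (v.max() - v.min()) / (v.max() + v.min())  # percent flicker

      print(f"Dominant flicker frequency: {peak:.1f} Hz")
      print(f"Modulation depth: {depth:.1f} %")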

      Dithering requires a microscope.


        simplex

        1. Software also matters; I'm using Win10 1809 17763.1098 with updates blocked

        You wrote earlier:
        "Here is my safe workstation: 9600kf/Gigabyte z390d(F2)/Asus 2070s(466.63)/win10 LTSC 1809 build 17763.1098"
        if the Windows version is so important, can you clarify a few details:
        1) Windows 1809_LTSC or 1809_"regular"?
        2) Is build 17763.1098 what you get immediately after a "clean" installation?
        (If so, could you please write the full name of the .iso image and its MD5/SHA checksum?)

        I can't find an original Microsoft LTSC_1809_build_17763.1098 image
        I'm installing the initial LTSC_1809 (build_17763.316) + kb4538461 (Cumulative Update, March 2020) = LTSC_1809_build_17763.1098
        I don't know whether these are equivalent steps...
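
        By the way, you can verify you really landed on 17763.1098: Windows keeps the build number and the Update Build Revision (UBR) in the registry. A minimal sketch, reading only standard locations:

        import winreg

        # The build number and Update Build Revision (UBR) live in a standard key.
        key = winreg.OpenKey(
            winreg.HKEY_LOCAL_MACHINE,
            r"SOFTWARE\Microsoft\Windows NT\CurrentVersion",
        )
        build, _ = winreg.QueryValueEx(key, "CurrentBuildNumber")  # e.g. "17763"
        ubr, _ = winreg.QueryValueEx(key, "UBR")                   # e.g. 1098
        print(f"Windows build: {build}.{ubr}")                     # expect 17763.1098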

        What CPUID does your 9600kf have? 906EC or 906ED? (You can check in AIDA64: https://download.aida64.com/aida64engineer735.zip)

        What is the microcode version AFTER Windows loads? (You can also check that in AIDA64.)

        1. Using CRU, you can remove the monitor's color data to allow Windows to use the "default" color space without transforming.

        Can you elaborate a bit on "remove monitor color data"? Is this related to editing the EDID info?

        When I sit at my z390+rtx2070s, pain between the brows begins after 2 hours. I switched the GPU to a gtx1060, cleared CMOS by removing the battery, and the pain stopped

        Do I understand correctly that z390+2070s is easier on the eyes than z690+2070s (same card)? And z390+1060 is even easier than z390+2070s? Did the GPU driver version change?
        What is the full name of the 1060 model (+ BIOS version)?

          moonpie
          How can you claim the Opple and Radex are "good enough" when there isn't a single datasheet or spec sheet for the probes used or for the scope/device itself? There aren't any teardowns of either device, either.
          You can use a screwdriver to hammer down a nail as well. Something may work, but that doesn't make it optimal for the task.
          "Any cheap handheld scope" is not an optimal setup for properly evaluating backlight brightness stability.
          Dithering testing requires an oscilloscope + probe as well; a microscope doesn't tell us anything.

            glvn 1) Windows 1809_LTSC or 1809_"regular"?
            2) Is build 17763.1098 what you get immediately after a "clean" installation?

            I am using a customized ISO image which I got in 2021. I suppose its base is Windows 10 Enterprise LTSC (x64) Build 17763 plus the update that produces version 17763.1098, so your approach, LTSC_1809 (build_17763.316) + kb4538461, looks correct

            glvn what CPUID does your 9600kf have?

            My 9600kf has CPUID 906ECh and microcode AEh. I can check the same on my 2nd machine a bit later, since the 2nd machine is also safe

            glvn Is this related to editing the EDID info?

            True. I found my Xiaomi Mi 27 2K used some EDID data to make the gamma brighter; that's why, using a DP cable, I removed all extension blocks in CRU (bottom part of the main window) and added a custom resolution in the first block of the same main window (because the extension block contains detailed resolutions too). But if you use an HDMI connection, you need to add the extension block back with the HDMI 2.0 support string at default settings and add the extra resolutions in the upper list.

            After that I found my BenQ on a DVI connection is very calm, and after checking in CRU I found all its blocks were empty, so over a DP cable I reproduced the DVI condition to get the same calm monitor behavior
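
            For anyone wanting to double-check what CRU leaves behind, here's a small sketch that inspects a raw EDID dump (assuming you've exported it as edid.bin; the filename is an assumption). Byte 126 of the 128-byte base block holds the extension block count, and every 128-byte block must sum to 0 modulo 256:

            # Inspect a raw EDID dump, e.g. exported as "edid.bin" (assumed name).
            with open("edid.bin", "rb") as f:
                edid = f.read()

            ext_count = edid[126]  # byte 126 of the base block = extension block count
            print(f"Extension blocks: {ext_count}")  # 0 after stripping them in CRU

            # Every 128-byte EDID block must checksum to 0 modulo 256.
            for i in range(0, len(edid), 128):
                block = edid[i:i + 128]
                ok = sum(block) % 256 == 0
                print(f"Block {i // 128}: checksum {'OK' if ok else 'BAD'}")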

            glvn z390+1060 is even easier than z390+2070s

            Yes. The 2070s (and 2080s) is tolerable, while the 1060 is not stressful at all. I sold the z690 because it strained my eyes more than the z390 with the same graphics cards/drivers. Not one of my graphics cards was calm with the z690, not even the gtx1060

            My laptop with a gtx1060 is also very comfortable and calm (Asus GL702VM, 7300hq, gtx1060, 6-bit LP173WF4-SPF3 panel) with the same Win10 build and 461.67 NVIDIA drivers

            glvn Did the GPU driver version change?
            What is the full name of the 1060 model (+ BIOS version)?

            No, I tested with the same 466.63 drivers. After changing the graphics card, I reset the BIOS and cleared NVRAM by removing the CMOS battery for 1 hour, then uninstalled the drivers via DDU in Windows safe mode (though safe mode is not necessary). When installing new drivers, I choose only the display driver + PhysX and check the "clean install" option

            The 100% safe graphics card with the Gigabyte z390d (F2 BIOS of 2019.10.15): the white Asus GTX 1060 Dual, card BIOS 86.06.0E.00.41, connected via an HDMI-to-DVI cable

            Here is my CRU screenshot - no bit depth, no extension block

            You know, I exported all registry settings to find differences after changing the 2070s to the 1060 and found nothing (except the card name etc.). I also checked the DirectX settings in the registry: nothing. All registry parameters are practically the same.

            The issue is how the graphics card renders the image, how it applies sharpness/noise reduction. I used to think the issue was in DX12.2 features able to make the image look better in terms of "depth of view" etc., but my current theory is that all new graphics cards use upscaling or other techniques that require smoothing (dithering) and extra pixel movement

            The 1660s, which is safe according to this forum, has the same architecture as the rtx20 series (Turing) but doesn't have RTX/DLSS

              Staycalmsyndrome

              Someone has to slow down progress, stir the air, and say "this is all in vain, nothing will work" 🙂 That role is already taken on this forum

              qb74
              They came close enough to the scopes and photodiodes that did come with datasheets. Were they as accurate? Nope. Were they close enough? Yep.

              https://www.mouser.com/ProductDetail/ams-OSRAM/BPW-34-S?qs=vLWxofP3U2zVT0CDmWwS1A%3D%3D
              https://www.mouser.com/datasheet/2/588/asset_pdf_5173751-3418818.pdf
              $0.86
              vs
              $400+ for a photodiode, battery, and resistors in a premade probe.
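
              To put rough numbers on why the cheap part is adequate: a photodiode feeding a load resistor forms a simple RC low-pass with f_3dB = 1/(2πRC). A sketch of that arithmetic, assuming roughly 70 pF of zero-bias junction capacitance for the BPW 34 S (ballpark from the datasheet) and an assumed 10 kΩ load:

              import math

              # RC low-pass model of a photodiode into a load resistor.
              # C ~ 70 pF is a ballpark zero-bias capacitance for the BPW 34 S;
              # R = 10 kOhm is an assumed load. Both values are illustrative.
              C = 70e-12   # farads
              R = 10e3     # ohms
              f_3db = 1 / (2 * math.pi * R * C)
              print(f"-3 dB bandwidth: {f_3db / 1e3:.0f} kHz")  # ~227 kHz

              PWM backlights flicker at hundreds of Hz to tens of kHz, so even this crude detector has plenty of headroom.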

              A microscope literally tells you everything you need to know about dithering.

              glvn

              I have added the CRU screenshots with the emptied data; feel free to ask for any details!

              I also tested the mini PC (13700h + Iris Xe 96EU) for a whole week with the same BenQ and HDMI-DVI-D cable; it seems okay, similar to z390d+gtx1060:

              1. Win10 Pro 21h2 19044.1889, Windows Feature Experience Pack 120.2212.4180.0, updates stopped
              2. Intel Graphics driver 31.0.101.5522 (default settings)

              I plan to test Win10 Pro 19045.4780 (updated up to August 2024) on the mini PC next week

              4 days later

              simplex

              I switched the GPU to a gtx1060, cleared CMOS by removing the battery, and the pain stopped

              After switching, besides the feeling of pain, do you see any visual difference in the image, banding for example?

              My 9600kf has CPUID 906ECh and microcode AEh,

              even earlier you wrote:
              "the build number is very important: 1809 LTSC 17763.1098, the next 2 processor updates (kb4589208 and kb5035849) will bring back your eye strain"
              https://ledstrain.org/d/2712-windows-with-colorcontrol-still-makes-me-eye-strain/13

              BIOS F2 contains microcode 0xAE for 906EC, but Windows can update the microcode
              (in particular, kb4589208 updates the microcode to 0xCA for 906EC)

              My guess: if installing kb4589208 ALONE adds eye strain, perhaps the microcode version is ONE of the factors??
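
              You can also read the microcode revision Windows actually loaded straight from the registry instead of AIDA64. A minimal sketch, using the standard per-CPU key (the revision normally sits in the upper four bytes of the value):

              import winreg

              # The loaded microcode revision is exposed per CPU under a standard key.
              key = winreg.OpenKey(
                  winreg.HKEY_LOCAL_MACHINE,
                  r"HARDWARE\DESCRIPTION\System\CentralProcessor\0",
              )
              # "Update Revision" is binary; the revision is in bytes 4-7, little-endian.
              raw, _ = winreg.QueryValueEx(key, "Update Revision")
              revision = int.from_bytes(raw[4:8], "little")
              print(f"Loaded microcode revision: 0x{revision:X}")  # e.g. 0xAE or 0xCA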

              After that I found my BenQ on a DVI connection is very calm, and after checking in CRU I found all its blocks were empty, so over a DP cable I reproduced the DVI condition to get the same calm monitor behavior

              If I understood you correctly, the same monitor connected via DP causes less eye strain with the EDID modification than without it?

              The 100% safe graphics card with the Gigabyte z390d (F2 BIOS of 2019.10.15): the white Asus GTX 1060 Dual, card BIOS 86.06.0E.00.41, connected via an HDMI-to-DVI cable

              HDMI on the graphics card side?
              In the NVIDIA control panel, is such a connection displayed as DVI or HDMI?
              I'm trying to understand: does eye strain depend on the type of monitor interface (in a potentially safe combination, z390/1060/17763.1098)?

              I found gtx1060 and 1660s (which is safe according to this forum)

              unfortunately, not all 1660s are safe (

              https://ledstrain.org/d/1524-eye-strain-with-new-gpu/79
              "Before that i use gtx 1660s inno3d twinx2 gddr6 SAMSUNG and this one is the best. No eyestrain at all.
              Then i use gtx 1660s inno3d twinx2 gddr6 micron and hyenix and make me feel eyestrain."

              There was also a discussion on this forum about the safe variant of the 970; the general conclusion, as I remember it, was that only one specific variant is safe (Gigabyte G1 Gaming 970 version 1.0), not every Gigabyte 970 (

              I also tested the mini PC (13700h + Iris Xe 96EU) for a whole week with the same BenQ and HDMI-DVI-D cable; it seems okay, similar to z390d+gtx1060

              i5 12600k + UHD 770 - all is OK

              Is there a difference in eye strain between these configurations?

              You know, I exported all registry settings to find differences after changing the 2070s to the 1060 and found nothing

              And have you experimented with the DitherRegistryKey in the registry?

                glvn visual difference in the image

                Can't say regarding visuals between the rtx20 and the gtx1060, but:

                1. z390d + gtx1060 and 13700h + Iris Xe look the same in terms of banding on the same monitor
                2. rtx3080 or 3080ti (already sold) looks brighter, more "smooth" or noise-reduced? But that observation is for the z690d chipset (z690d + UHD770 iGPU looks as I wrote above)
                3. I also recorded z690/z390 and iGPU/2070s on the same monitor with a camera (4k60p): no visual difference at all, the same "pixel walking", which is the monitor's FRC (one way to quantify this is sketched below)
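
                For anyone repeating that camera test, here's a minimal sketch of comparing consecutive frames of such a recording, assuming OpenCV is installed and the clip is saved as capture.mp4 (filename assumed, and the clip should show a static image). FRC/temporal dithering shows up as persistent small frame-to-frame differences:

                import cv2
                import numpy as np

                # Compare consecutive frames of a recording of a static screen.
                # Assumes the clip has at least two frames.
                cap = cv2.VideoCapture("capture.mp4")  # filename is an assumption
                ok, prev = cap.read()
                diffs = []
                while True:
                    ok, frame = cap.read()
                    if not ok:
                        break
                    d = cv2.absdiff(frame, prev)
                    # mean difference and share of pixels changing by >10 levels
                    diffs.append((d.mean(), (d > 10).mean() * 100))
                    prev = frame
                cap.release()

                means, pcts = zip(*diffs)
                print(f"Mean frame-to-frame difference: {np.mean(means):.2f}")
                print(f"Pixels changing per frame (>10 levels): {np.mean(pcts):.2f} %")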

                glvn perhaps the microcode version

                Maybe. After I've tested Win10 21h2 and the latest Win10 22h2 without any discomfort on the 13700h + Iris Xe, I plan to test the same Windows build on the z390d + gtx1060

                glvn less eye strain with the EDID modification than without

                True, tested with both the Xiaomi and BenQ monitors

                glvn HDMI on the graphics card side?

                Yes, HDMI is on the graphics card side, DVI-D on the monitor's (it also accepts DP; there are no HDMI ports on the monitor)

                glvn In the NVIDIA control panel, is such a connection displayed as DVI or HDMI?

                As a 32-bit DVI connection: there are no options to choose bit depth or the 4:2:2 sampling that DP brings (HDMI limitations, I suppose)

                glvn Is there a difference in eye strain between these configurations?

                My wife says the z690d + 12600k is strain-free, but comparing z390+1060 (Win10 1809) vs 13700h+Iris (Win10 22h2), I think they are both comfortable

                glvn experimented with the DitherRegistryKey in the registry?

                Not only that, but yes. Changing only those keys changes nothing; the ColorControl app activates more Windows options, after which I can change the values you wrote and get an instant result. It's simpler to use ColorControl for that
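
                In case anyone wants to locate that value themselves, here's a small read-only sketch that walks the display-adapter device class key (the standard GUID for display adapters) and reports any DitherRegistryKey values it finds. The value name is the one discussed in this thread; whether and where it is present depends on the driver, so this only searches:

                import winreg

                # Display-adapter device class; subkeys 0000, 0001, ... are adapters.
                CLASS = (r"SYSTEM\CurrentControlSet\Control\Class"
                         r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

                with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CLASS) as root:
                    i = 0
                    while True:
                        try:
                            sub = winreg.EnumKey(root, i)
                        except OSError:
                            break  # no more subkeys
                        i += 1
                        try:
                            with winreg.OpenKey(root, sub) as k:
                                value, _ = winreg.QueryValueEx(k, "DitherRegistryKey")
                                print(f"{sub}: DitherRegistryKey = {value}")
                        except OSError:
                            pass  # this subkey has no such value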

                I also experimented with various other NVIDIA values without success; "DmaRemappingCompatible"=dword:00000003 was added in 472.12 for the 2070s (not present in 466.63)

                The main differences between the 1060 and the 2070s:

                The gtx1060 has:

                [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\nvlddmkm\Global\NvHybrid\Persistence\ACE\HDR]
                "BrightEv"=dword:00000000

                [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\nvlddmkm\Global\Startup\StartS3SR]
                @=dword:00000001

                "SaturationRegistryKey"=dword:00000032

                2070s has:

                [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\nvlddmkm\Global\NvHybrid\Persistence\ACE\PFF\256]
                "CurveType"=hex:01,00,00,00,01,00,00,00,01,00,00,00
                "FrequencyHz"=hex:80,0e,80,69,80,0e,80,69,40,5b,aa,5f
                "Temperature"=hex:00,53,00,00,00,55,00,00,00,57,00,00
                "ThermalLimit"=hex:00,00,a6,42

                [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\nvlddmkm\Parameters]
                "DmaRemappingCompatible"=dword:00000003

                Also, the DirectX feature-limit values in the registry (12.2 or 12.1, for example) don't matter either: the rtx20 has the same values but still causes pain

                glvn

                The 100% safe graphics card with the Gigabyte z390d (F2 BIOS of 2019.10.15): the white Asus GTX 1060 Dual, card BIOS 86.06.0E.00.41, connected via an HDMI-to-DVI cable

                I forgot to ask: what brand of memory is on your 1060? (Samsung, Hynix, Micron...)

                  glvn

                  gtx1060 - GDDR5 Samsung

                  rtx2070s - GDDR6 Micron

                  rtx2080s - GDDR6 Samsung

                  rtx3080, 3080ti - GDDR6X Micron

                  9 days later

                  I changed the RAM in the safe mini PC (13700h + Iris Xe 96EU) and got eye strain

                  The 13700h processor supports up to DDR5-5200

                  KVR48S40BD8K2-64 was installed (2x32 GB, 4800 CL40, Micron(?) chips) - no issues

                  I installed KF556S40IB-16 (2x16 GB, 5600 CL40, Hynix modules), which ran at 5200 CL38 - discomfort

                    simplex
                    I have a weird theory which you could try.
                    Could you try running RAM testing software such as Karhu, MemTestPro, or TM5 and see if you're stable in them for 16+ hours?
                    Or, if possible, use the same RAM but lower the frequency to 4800 MT/s.
                    You might be, as the overclocking community puts it, "unstable", and that might influence your smoothness perception during use.

                      qb74 16+ hours?

                      too much, dude %)

                      The mini PC can't edit timings, but my safe PC, z390d+gtx1060, can

                      I inserted the rtx2070s (which gave me strain after 2 hours about a month ago, after which I switched to the gtx1060 and got calm)

                      First of all, I tested whether CPU overclocking matters, setting default (3.7 GHz), then the Gaming profile (4.4 GHz), then the Advanced profile (4.5 GHz) - nothing changed

                      Then I tested how the DDR4 memory works in my z390 by default - at 2133 (15-15-15-36). I can't say it is paper-like, but it's 100% not as straining as the XMP profile (3200 CL16-16-18-36) was

                      Then I set 3000 with the default timings the motherboard calculated, 21-22-22-50 - it was the same as 2133

                      Then I set 20-18-18-36, keeping the same 3000 - a bit worse

                      Then 16-18-18-36 @ 3000 - definitely bad

                      Okay, I set 3200, the max supported memory frequency - it was also good with 22-23-23-53

                      Then 20-18-18-36 - not as good

                      Then 18-18-18-36 - bad

                      Finally, I'm continuing to test my Win10 with the motherboard's auto timings (3200 22-23-23-53)

                      Here are the bad timings causing eye strain with the 2070s:

                      Here are the good ones, calculated by the motherboard (not paper-like but… okay? still testing)

                      Imagine if memory can have an impact

                        simplex

                        Imagine if memory can have an impact

                        RAM can impact systems in ways most people don't comprehend.
                        It can cause microstutters, which you could easily be experiencing, thus leading to your eyestrain.
                        Think of it like this (in a rudimentary way):
                        XOC (term for extreme overclock, used here to indicate unstable RAM behavior) => microstutter => need to refocus more often => eyestrain
                        (this is just a theory for the eyestrain part)

                        Your testing seems to confirm this even further.
                        Keep in mind, the system can get corrupted if your RAM is unstable, which could (potentially) further increase eyestrain.

                        RAM at XMP can be unstable if paired with a mediocre motherboard or CPU IMC. You're only relatively "safe" at CPU-vendor-specified speeds (which gets very hazy with DDR5-era CPUs).

                        Would be funny if people here have eyestrain due to unstable RAM.

                        EDIT: Would it be possible for you to use ASRock Timing Configurator or MemTweakIt to show your entire timing list? (primary, secondary, tertiary)

                        Link for first utility: https://download.asrock.com/Utility/Formula/TimingConfigurator(v4.1.0).zip

                          dev