hpst Yeah, I think it makes sense, as every manufacturer is all about HDR, high contrast, sharpness, bla bla... And I think the hardware can't fully deliver this yet, so they need to help with software voodoo like dithering.
(yes, blue light and PWM are their own things, but I think we are in the HDR era now, where dithering/FRC plays a big role).

  • JTL replied to this.
  • hpst likes this.

    deepflame I was hoping all these modern HDR monitors would finally knock some sense into these companies and get them producing better panels, but nothing's that simple... 🙁

    • hpst replied to this.

      JTL So is this in line with how you were working to stop dithering? Or, if not, is it a workable approach? Tricking the system into believing a higher-quality panel is in use, so it doesn't think it needs to turn dithering on?

      • JTL replied to this.

        hpst I didn't even know that forcing 10-bit color could stop dithering. I was assuming that forcing 10-bit color would make it try to dither a 6-bit panel up to 10 bits and fail. Or in the case of an 8-bit panel it might work, using dithering to make a 10-bit image, but that's not what we want.

        It's kind of complicated, but my approach was just reverse engineering and editing the driver code to stop dithering altogether. Harder than it sounds.
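
        (An illustrative sketch, not something from this thread: not every stack needs the driver patched. This assumes an X11 session and a driver like nouveau that exposes a "dithering mode" RandR output property; the proprietary Nvidia driver has a comparable Dithering attribute in nvidia-settings. The output name is a placeholder.)

        ```python
        # Sketch: ask a driver that exposes dithering as an output property
        # to turn it off, instead of editing driver code.
        import subprocess

        OUTPUT = "DVI-I-1"  # placeholder; find yours with `xrandr --prop`

        # List output properties; look for "dithering mode" / "dithering depth".
        print(subprocess.run(["xrandr", "--prop"],
                             capture_output=True, text=True).stdout)

        # If the property is there, request no dithering on this output.
        subprocess.run(
            ["xrandr", "--output", OUTPUT, "--set", "dithering mode", "off"],
            check=True)
        ```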

        • hpst replied to this.

          JTL So do you think @deepflame's experience is a one-off, specific to his/her hardware and apps? Or is it logically reproducible in general, on the idea that the system requests a certain level of color and that lying to the system (as his setup seems to be doing by claiming a 10-bit panel in a setting even though the hardware really isn't) would work on a macro scale? How would we proceed to determine this, and then how would we extrapolate it to a larger solution, like telling the whole system the panel is high-bit and hoping for the same response the apps are giving, with no dithering asked for?

          • JTL replied to this.

            hpst I don't know at the moment. I do have some ideas to test this theory in more detail though.

            • hpst replied to this.

              JTL If you know any way I could check it on a 6-bit panel running a Linux distro, please let me know. If there is some way to trick the OS into thinking the panel is 8-bit/10-bit etc. to see if this stops the 6+2 dithering crap. This could be a real breakthrough, tactic-wise. I just don't know where Linux makes this request... like, is it in the various drivers? In @deepflame's case it sounds like the individual apps are requesting 10-bit and are accepting being told the panel is 10-bit by that software setting, JUST for those app windows, thus not needing to dither (I'm not sure if it's a Windows graphics setting or what). It's probably a lot harder than I am imagining... but it would be amazing if this were a viable way to get around dithering for now.

              • KM replied to this.

                deepflame If you are not a gamer

                But I am - and a keen flight simulation / sim racing fan as well - so not being able to upgrade my trusty 970 to something much more powerful is extremely maddening.

                • JTL replied to this.

                  Adding my experiences to the wider discussion here. I'm A-OK with my Plasma TV, and I'm OK with my Dell 2407/2410 or similar CCFL screens on an Nvidia 970 (and AMD 6950), as well as my 2013 MacBook Pro's IPS screen.

                  Otherwise, several new LCD TVs, most iPads and iPhones (haven't tried OLED), various laptops, several new computer monitors (high refresh, low blue, AMVA, IPS), the Xbox One, and 980, 980 Ti, 1070 and 1080 cards all give me eye strain. At this point I'm thinking: wow, it's both the screens AND the hardware that's driving them. Realising it was also the video cards was a sad moment. But then you add the new versions of Chrome and Firefox, and Prepar3D Flight Sim V4, which ALSO give me the same eye strain, and the whole situation gets twice as convoluted and frustrating.

                  Now I really don't know what to make of it. Screens do it. Video cards do it. And now individual windows (partial or full screen) can do it too. For the life of me I cannot decide if it's blue light, PWM, dithering or something else. Right now my money's on dithering, though I can't rule out something else. And you know it's damned expensive buying gear just to TRY it, only to have to on-sell it at a loss. Where I live we don't have easy try-before-you-buy or money-back returns (except Apple, which I've taken up on a couple of iPad tests).

                  And yes I've had my eyes tested, blah blah blah, and I counter that by saying - "I can watch/use my older hardware for hours with no problems at all - so it's not my eyes!"

                  Personally, all I really want right now is to be able to go out and buy a more powerful GPU for my flight sims - I can live with 24" screen(s). But then if I ever change jobs... I guess I'll need to buy this MacBook Pro 2013 that I'm happily using - from my current employers.

                    AgentX20 But then you add the new versions of Chrome and Firefox, and Prepar3D Flight Sim V4, which ALSO give me the same eye strain

                    Even on your otherwise safe setups? Just those apps specifically strain you but the rest is ok?

                      hpst Even on your otherwise safe setups? Just those apps specifically strain you but the rest is ok?

                      Yes, exactly.

                        AgentX20 An equivalent Quadro should be similar in performance to the GeForce card. My K4000 is about the same as a 660/750 in CS:GO on Linux, for instance.

                        AgentX20 I wonder what's going on there... whether it's app/site dithering or something else. On my super old laptop, which is my only safe device, those apps/sites don't strain me, even though I can definitely tell they "look" different in some odd way now. Some sort of smooth/shiny/glowing effect that wasn't there before. Maybe my laptop is too old to do the dithering tricks, or to do them as intensely, so it's just not bothering me. I don't have ANY modern hardware that doesn't suck, so I can't even compare sites/apps, as everything on them strains me.

                        hpst I just don't know where Linux makes this request... like, is it in the various drivers?

                        Monitors can report their bit depth as part of the EDID info they send to the computer. It can be faked. However, I'm not sure whether this approach keeps blocking/faking the true EDID for the whole time the system is running. And it is a lot to read. I tried reading monitor EDIDs some years ago and saw that my Dell U2515H reported true 8-bit, while the BenQ EW2440L didn't report anything about its color capabilities. Whether software/drivers always respect that info is not clear. Some cards probably ignore it and always dither.
                        https://wiki.archlinux.org/index.php/kernel_mode_setting#Forcing_modes_and_EDID
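
                        (An illustrative sketch, not from the thread: on Linux the kernel exposes each connector's raw EDID under /sys/class/drm, and in EDID 1.4 the declared bits per color sit in byte 20, bits 6-4, for digital inputs. Older EDID revisions simply don't carry the field, which would match the EW2440L reporting nothing. The link above also covers overriding the EDID, e.g. via the drm.edid_firmware= kernel parameter.)

                        ```python
                        # Sketch: decode the advertised bits per color from each
                        # connector's EDID (EDID 1.4: byte 20, bits 6-4).
                        from pathlib import Path

                        DEPTHS = {0b001: 6, 0b010: 8, 0b011: 10,
                                  0b100: 12, 0b101: 14, 0b110: 16}

                        for edid in Path("/sys/class/drm").glob("card*-*/edid"):
                            raw = edid.read_bytes()
                            if len(raw) < 128:
                                continue  # connector with nothing attached
                            if (raw[18], raw[19]) < (1, 4) or not raw[20] & 0x80:
                                print(f"{edid.parent.name}: depth not declared")
                                continue
                            bpc = DEPTHS.get((raw[20] >> 4) & 0b111, "undefined")
                            print(f"{edid.parent.name}: {bpc} bits per color")
                        ```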

                          KM @JTL I am clearly out of my depth, but in @deepflame's situation he said a software setting on the GPU was set to "10-bit mode"... and even though the laptop's panel wasn't 10-bit, this fooled the apps requesting it into believing they didn't need to dither to get those 10 bits. So it sounds like the apps trusted the GPU and not the display?

                          IF trickery would even work universally, rather than having to fake hundreds of individual panel EDIDs to report false capabilities back to whatever is requesting that quality, my thought (again based on very shallow ideas and no programming ability, so it might be nonsense) was: could we come at it from the other direction? Could we take what his GPU setting was doing (fooling the requesting apps so they don't bother to dither, since they think they don't need to in order to achieve the 10 bits) and extrapolate it to a higher-level, global behavior? Whatever it is in the OS that says "show the desktop and file system and everything else at this quality, and dither to get that quality if you can't do it natively" (just like the apps in his case; and it has to be happening somewhere in software, since a bog-standard distro makes 6-bit laptop panels dither to more colors, right?) could maybe be fooled the same way, so we'd get a system-wide change rather than going app by app.

                          Is that just obviously stupid, and am I too ignorant of how everything comes together for this to even be possible? What is it in Linux, for example, that decides what quality to display the graphical DE at and triggers dithering to get it? Is it the GPU driver? Xorg? As far as I know laptop panels are "dumb" and aren't making that choice internally like some desktop monitors can, right?
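
                          (Again just an illustration, not @deepflame's Windows setting: the closest global knob I know of on Linux is the "max bpc" DRM connector property that newer kernels expose for some drivers (i915, amdgpu). It caps the bit depth of the entire output, desktop included; whether a given driver then stops dithering is exactly the open question here.)

                          ```python
                          # Sketch: cap a whole output at 8 bits per channel via
                          # the "max bpc" connector property, settable through
                          # xrandr under X11 where the driver exposes it.
                          import subprocess

                          OUTPUT = "eDP-1"  # placeholder; see `xrandr --query`

                          subprocess.run(
                              ["xrandr", "--output", OUTPUT,
                               "--set", "max bpc", "8"],
                              check=True)
                          ```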

                          • JTL replied to this.

                            hpst @JTL I am clearly out of my depth, but in @deepflame's situation he said a software setting on the GPU was set to "10-bit mode"... and even though the laptop's panel wasn't 10-bit, this fooled the apps requesting it into believing they didn't need to dither to get those 10 bits. So it sounds like the apps trusted the GPU and not the display?

                            I don't have the same software and hardware in front of me to test, so I don't know what his setup is doing.

                            hpst Is that just obviously stupid, and am I too ignorant of how everything comes together for this to even be possible? What is it in Linux, for example, that decides what quality to display the graphical DE at and triggers dithering to get it? Is it the GPU driver? Xorg? As far as I know laptop panels are "dumb" and aren't making that choice internally like some desktop monitors can, right?

                            No. @deepflame's experiment is the first I've heard of this, but I'd need the same hardware in front of me to understand what it's doing and whether I can port it to something more universal.

                            • hpst replied to this.

                              JTL I believe it is a ThinkPad W500? If you don't need one in awesome condition, they're under 100 USD on eBay. If nobody else would pitch in a few bucks toward it, I'd be willing to just order it for you myself, if you believe it could at least rule this in or out as an option, since it (along with your work) is the only actionable idea I have seen. I can eat beans for a while if it could even possibly lead to a breakthrough.

                              • JTL replied to this.

                                hpst I want to get the capture card stuff under control first, which requires some free time from other duties to write the image capture/comparison program (easy, don't worry; the comparison half is basically diffing successive frames, as sketched below).

                                Thanks though. Means a lot to me.
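
                                (A trivial illustration of the comparison idea, not hpst's actual program: temporal dithering on a static test image shows up as pixels flickering between successive captured frames. Assumes two same-size captures and the Pillow + NumPy libraries; the filenames are placeholders.)

                                ```python
                                # Sketch: diff two captured frames of a static
                                # image; any nonzero delta hints at temporal
                                # dithering on the captured output.
                                import numpy as np
                                from PIL import Image

                                a = np.asarray(Image.open("frame1.png")
                                               .convert("RGB"), dtype=np.int16)
                                b = np.asarray(Image.open("frame2.png")
                                               .convert("RGB"), dtype=np.int16)

                                diff = np.abs(a - b)
                                changed = (diff > 0).any(axis=2)
                                print(f"{changed.mean():.4%} pixels changed")
                                print(f"max channel delta: {diff.max()}")
                                ```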

                                  JTL Sorry for being pushy or sounding overeager. I'm desperate, as you know, so I tend to get intense and get a burst of "hope energy" when I think there is a possibility in the darkness. If you know anyone else who would be good to approach about this particular angle, please let me know. I am still interested in contributing to the card as well, since any solution is a godsend... and I am surely missing something about this idea, as things are never as easy as toilet logic makes them sound.


                                  KM Some cards probably ignore it and always dither.

                                  s/some/almost all

                                  If they respected the EDID, I think for the most part we wouldn't have this issue.

                                  • KM likes this.