• AbstractOS
  • Eyestrain when switching from Windows to Linux

whystrainwhy

Yes, in my case I observed less pixel flicker with the CG2700X than the CS2740, but both had a matte texture that I didn't like. Perhaps the wide-gamut aspects also contributed in a negative way (stuff like KSF phosphor). Unfortunately, I didn't get a chance to test their flicker or color spectrum as much as I would have liked. Hardware and software both matter; perhaps the Arc GPU and some older Windows 10 versions would make a difference for you. I think adaptation strongly affects one's experience, so I decided to keep the LG monitor with its high refresh rate, as it felt similar to what I was used to on my laptop screen (Lenovo Legion 7i with 240 Hz). Best would be to demo the monitor in a store if possible.

https://ledstrain.org/d/2849-lenovo-t480-yes-win10-1809-downgrade-matters-just-like-others-say-here/18

whystrainwhy Save yourself some time: just focus on the compositors first. On, off, different rendering backends, different compositor brands. You also have the issue of video drivers as well: proprietary, open-source, modesetting, etc.
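By way of illustration, here's a rough Xorg triage script for working through those variables one at a time. Command names and the Xorg log path are common defaults, not guaranteed on your distro; the compositor names are just the usual suspects:

```shell
# Sketch of a quick triage, one variable at a time. Everything is wrapped
# defensively so missing tools just report "unknown" instead of failing.

# 1. Which kernel driver is bound to the GPU? (i915 vs amdgpu vs nvidia/nouveau)
gpu_driver=$(lspci -k 2>/dev/null | grep -A3 -iE 'vga|3d' | grep 'Kernel driver' || true)
echo "GPU driver line: ${gpu_driver:-unknown (is lspci installed?)}"

# 2. Is a standalone compositor running? Test with it off vs. on, and with
#    different backends, e.g.:  picom --backend glx   /   picom --backend xrender
compositor=$(ps -e 2>/dev/null | grep -oE 'picom|compton|kwin|mutter|xfwm4' | head -1 || true)
echo "Compositor: ${compositor:-none detected}"

# 3. Which Xorg video driver actually loaded? modesetting vs. the vendor DDX
#    is one of the variables worth isolating.
grep -iE 'modeset|intel|amdgpu|nouveau|nvidia' /var/log/Xorg.0.log 2>/dev/null | head -3 || true
```

The point is to change exactly one of these at a time between eye-strain tests, otherwise you can't tell which layer was responsible.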

    Sunspark I appreciate that. I actually have an older AMD video card lying around that used to be fine with Windows 10 before I upgraded to my 3080 Ti, so I'm going to put together something small and inexpensive just to test things out without messing up my current working configuration.

    I do wonder if Apple or Microsoft will eventually fix their dithering issues, or whatever is causing people issues in their operating systems. I know that there are some people on here who have been struggling since 2017 (and probably before that). I also wonder if true 10-bit displays would effectively nullify the dithering issue, assuming GPU manufacturers stopped applying those algorithms when they detect a true 10-bit display.
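To make the 10-bit point concrete, here's a toy Python sketch of temporal dithering (FRC): a pipeline that only has 8 bits per channel can fake a 10-bit level by flickering between the two nearest 8-bit levels over successive frames, which is exactly the flicker a true 10-bit path wouldn't need. Real GPU/panel FRC uses combined spatial and temporal patterns; this only shows the idea:

```python
def frc_frames(value_10bit, n_frames=4):
    """Approximate a 10-bit level on an 8-bit output by alternating
    between the two nearest 8-bit levels (toy temporal dithering)."""
    lo = value_10bit // 4     # floor to the nearest 8-bit level
    frac = value_10bit % 4    # remainder decides the mix ratio
    # show `lo + 1` on `frac` out of every 4 frames, `lo` on the rest
    pattern = [lo + 1] * frac + [lo] * (4 - frac)
    return (pattern * (n_frames // 4 + 1))[:n_frames]

frames = frc_frames(513, n_frames=8)   # 10-bit level 513 ~ 8-bit 128.25
avg = sum(frames) / len(frames)        # time-average approximates 513/4
```

On a true 10-bit display the value 513 would just be sent as-is, so there would be nothing to alternate between.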

      whystrainwhy I do wonder if Apple or Microsoft will eventually fix their dithering issues

      not sure

      whystrainwhy true 10 bit displays

      Someone here tried a "true 10-bit monitor"; no changes.

      2 months later

      Ok, so I'm back with a grim update, this time unrelated to linux.

      I recently sold my HP ZBook laptop and tried moving to an inferior machine, a Lenovo B570 with integrated Intel graphics. I upgraded the CPU to an i5 (it had a Pentium B950, barely usable) and added more RAM. To my unpleasant surprise, I wasn't able to use it, because guess what… eye strain. And this is on Win 7!

      At first I thought it was the screen (it has one of the worst panels), but then I tried an external display (NEC EA190M, CCFL) and it was the same exact thing. I tried, I really did, for several days to soldier through it, blinking my eyes constantly and finally only peeking at the screen briefly with one eye or the other because it's f…ng unbearable. I was almost convinced it was something with my eyes now, but… everything changed when, as a final and hopeless attempt, I plugged the same exact display into my old-ass Athlon-based desktop with its no less ancient Nvidia GT 240. And voilà, all good. Been using it up till now and typing this post on it.

      Apparently, the rumors about Intel graphics aren't unfounded. I hereby confirm: it is unusable for people like us. Incidentally, I had used this Lenovo laptop on and off before (it's not really mine, so all I did was fix and install stuff), but evidently it wasn't long enough for me to detect the problem. So, it seems, what we're facing here is not just a local Linux issue but a major multilayered disaster which, as time goes by, only becomes worse and ultimately unavoidable.

      I'm convinced it's not PWM-related (I checked the screen with my phone camera: no flicker) nor dithering (I mean, seriously, has playing with it solved anybody's issues?). It's something to do with acceleration, which must be producing some extremely elusive side effect. One thing I noticed, though, is that when connected to this laptop the picture seems… a little sharper? Perhaps the gamma is a bit off, like darker things are just a tiny bit darker here. Anyhow, I'm seriously concerned about the future now. I'm considering buying a new laptop, but what the heck am I to buy if all of it is equally unacceptable? It's ubiquitous! You can't even tell anymore if it's the GPU or the display or the OS/drivers or actually a combo, where at least one of these things must be checked anyway. This is no longer a lottery; it's just a flat-out dead end.

      I don't know about you guys, but I'm seriously getting bugged by this crap now. I value my time and my vision, and if this is not resolved in the next 1-2 years, I am going to have to leave the modern world behind and stick with older machines for good, fondly embracing their limitations. At the end of the day, they may not be very powerful and will eventually be disconnected from the web along with obsolete Windows versions and whatnot, but one thing they will always be good at is MAKING SENSE. And elusive nonsense of a purely commercial kind is something I'm growing very much allergic to. I don't see any point in adapting to it or spending all of my free time looking for workarounds for all sorts of bs. I'd rather just stop using it altogether and help those assholes go bankrupt. Besides, there isn't much to do on the internet anymore; the world is going to shit anyway. I sympathize with those who need it for work, though. Try finding another job, maybe, unless this one heck of a fun ride is really your calling.

      Sorry, had to vent.

        Pudentane leave the modern world behind and stick with older machines for good while fondly embracing their limitations

        the best strategy is just to use older laptops as "monitors" and remote desktop into modern hardware.

        I've been using NoMachine on a super comfy 2012 Windows 8 laptop, screen-sharing into an M1 Mac for nearly a year now, and it's working great 🙂

        RealVNC Viewer is also good if you need lossless instead of lossy compression.

        Pudentane when connecting to this laptop the picture seems… a little sharper? Perhaps the gamma is a bit off, like darker things are just a tiny bit darker

        I'm very certain that Intel is doing some type of contrast/edge enhancement or sharpening filter.


        The evidence actually dates all the way back to 2013 and driver version 9.x (look at the image attached to this post):

        https://community.intel.com/t5/Graphics/HD-3000-driver-forced-subpixel-antialising-after-update-to-9-x/td-p/359791

        "Notice the forced red "ghosting" on the right edge of black pixels"

        "You will notice a (hardware based ?, kernel driver based ?) subpixel anti-aliasing applied to the image […] it becomes anti-aliased even with ClearType disabled"

        "Notice that the solid background color looks different for 9.x and 8.x driver version"

        When they finally got someone from Intel to test, they apparently only ended up trying the 9.x driver instead of comparing it to 8.x, and dismissed the report because "they didn't see the blurry text".


        also these:

        https://imgur.com/a/hdvlU

        color fringing on vertical lines:

        https://forum.thinkpads.com/viewtopic.php?f=43&t=111111

        "Sharp edges are rendered in a fuzzy way. It seems that every pixel is blended very lightly with its four adjacent neighbors"

        Finally, this one implies that it might have already started to affect some external monitors too (although someone else on page 3 said it didn't affect their monitor):

        https://forums.lenovo.com/t5/T400-T500-and-newer-T-series/Blurry-fonts-on-T420-with-64-bit-Win-7-Intel-HD-Graphics-3000/td-p/1093969
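If you want to check your own setup for this kind of fringing, one low-tech test is per-pixel channel imbalance on a zoomed screenshot of black-on-white content: a clean grayscale rendering has R=G=B everywhere, while subpixel AA or edge enhancement leaves colored ghosts on edges. A rough numpy sketch (the 8×8 "screenshots" here are synthetic stand-ins, not real driver output; the threshold is a guess you'd tune):

```python
import numpy as np

def channel_imbalance(img):
    """Max absolute difference between the RGB channels at each pixel.
    Zero everywhere for pure grayscale rendering; nonzero where the
    renderer introduced color fringing."""
    img = img.astype(np.int16)  # avoid uint8 wraparound on subtraction
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    return np.maximum.reduce([np.abs(r - g), np.abs(g - b), np.abs(r - b)])

def has_fringing(img, threshold=16):
    return int(channel_imbalance(img).max()) > threshold

# Synthetic demo: a grayscale vertical line vs. one with a reddish ghost
clean = np.full((8, 8, 3), 255, dtype=np.uint8)
clean[:, 3, :] = 0                # black line, all channels equal

fringed = clean.copy()
fringed[:, 4] = (255, 200, 200)   # red-tinted ghost column on the right edge
```

For a real check you'd crop the screenshot to an edge region first, since font smoothing like ClearType is *intentional* fringing and would trip the same detector.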


        interestingly, these posts also imply that there may be potential to create a strain-free setup by buying a ThinkPad X220, installing an older (pre-10) version of Windows that supports this specific older working v8.x driver they're mentioning, and then installing that driver… 👀
