Eyestrain when switching from Windows to Linux
KM Not really, it came without a recommendation. But I saw that other users are quite successful with old ATI cards.
I chose that one because:
- It's 9 years old.
- It fits the rest of the setup.
- It has 2x DP for my double/triple screen setup.
- It was easy to find and order.
I have no idea whether it will work, which is why I'll post some feedback afterwards.
Hey! Here is my experience about the same eye strain problem:
- I use DELL P2415Q monitors (4k at 60Hz)
- macOS on a MacBook Pro 2015 works seamlessly (my main workstation for years)
- Windows 10 gives again beautiful pain-free picture on the same monitor
- Linux gives me eye strain instantly (as it does for many of you). I have tried Intel, AMD, and Nvidia GPUs; all of them gave the same eye strain on the same monitor.
I have not tested playing around with dithering, so I can't tell if that helps in my case. I am closely watching for a solution, as I would like to switch to Linux as my main workstation.
My guess is that the problem is the refresh rate. If I set it manually to 30Hz it feels very painful, and if I put it back to the normal 60Hz it still gives me pain. So I suspect the driver somehow sets the refresh rate incorrectly. However, xrandr and similar tools show the refresh rate is around 60Hz, and I don't have the technical knowledge to understand how the driver could mess up the refresh rate anyway.
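One way to sanity-check that guess: `xrandr --verbose` prints each mode's pixel clock and horizontal/vertical totals, and the effective refresh is just clock / (htotal × vtotal). A minimal sketch of that arithmetic (the 148.5 MHz and 2200 × 1125 numbers are the standard 1920x1080@60 timings, used here only as an example):

```shell
# Cross-check a reported refresh rate against raw mode timings.
# compute_refresh DOTCLOCK_MHZ HTOTAL VTOTAL -> refresh in Hz
compute_refresh() {
  awk -v clk="$1" -v ht="$2" -v vt="$3" \
    'BEGIN { printf "%.2f\n", (clk * 1000000) / (ht * vt) }'
}

# Standard 1920x1080@60 timings: 148.5 MHz clock, 2200 x 1125 total
compute_refresh 148.5 2200 1125   # -> 60.00
```

Plugging in the clock and totals that `xrandr --verbose` shows for the mode in use would at least confirm whether the driver's advertised timings add up to 60Hz; it can't tell you what the panel actually receives, though.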
Pudentane I'm done folks, good luck with this thing.
"If it works, NEVER change it"
"Security risks" of running older software and OS's are 99.9% BS. If you practice common sense safe computing you won't have any issues. Nearly all viruses, malware, and security breaches still come from morons clicking links in sketchy emails.
"If it works, NEVER change it" -- 100% agreed.
"Security risks" of running older software and OS's are 99.9% BS. 99% not agreed. Nothing to do with it and all these vulnerabilities are overrated and you know it. Also, I don't click on any sketchy links in sketchy emails, that's so 90s dude. Not sure why people persist with it though.
hello all!
Sorry for my English, I am not a native speaker.
I can not believe this thread exists. Linux is constantly improving; even the full version of my favorite game is available only on Linux, and I play it there. BUT this incredible eye strain, every time I go back to Linux, year after year, just kills my eyes and makes me stop using it after a few days. I managed to set up Clear Linux, Fedora, and openSUSE, and played this game for hours. I configured and messed with these distros for hours, and my eyes just became exhausted, the blood vessels in my eyes turned red and noticeable, and it felt like someone was stabbing my eyes with a needle.
I know Linux is very powerful on servers, supercomputers, etc. But the desktop environment just kills my eyes. I set up Microsoft fonts with anti-aliasing off and full hinting, with fonts as sharp as on Windows, BUT it is still very bad. On KDE and GNOME too.
Until this issue is fixed I can not switch to Linux on the desktop, even now when the full version of my favorite game is available only on Linux (reason: $).
Config: AMD 3300X and an Asus Nvidia GTX 1060 (both GNOME and KDE were used). My old config was a Radeon 7850; I remember getting eye strain on Linux at least 10 years back, or even more.
Linux on my PS4 Slim is usable. Same as the PS4 itself is usable while running the PS4 OS ("Orbis OS"), which is based on FreeBSD. On PS4 Linux, Xorg is running, 3D hardware acceleration is running, and no eye strain. I was about to look for buyable graphics cards that use the PS4's graphics chips, but it turned out the PS4 uses a custom graphics solution that is not available on the PC market.
What Linux being usable on the PS4 probably means is that Linux itself and its hardware acceleration code are not per se the cause of the eye strain; rather, the combination of GPU and graphics driver is. Perhaps there are usable combinations out there that we can use with PCs?
The PS4 GPU is from AMD. Since the past has shown that AMD on the PC uses particularly aggressive forced temporal dithering, this info might not help us much, though.
The PS4 Linux I tried uses the default window manager "jwm". Maybe that's a factor, too.
Edit: PS4 Firefox Quantum, hardware-accelerated and no eye strain.
KM I did test some "uncommon" AMD graphics chipsets a while ago with my capture card setup:
https://wiki.ledstrain.org/docs/appendix/tests/dithering/jtl/
Setting the xrandr dither property to "on" did enable temporal dithering "snow", as expected.
Wonder if this is helpful on Intel graphics for PWM:
The problem was:
My Linux distro sets a low PWM frequency by default; that's what caused all the eyestrain. Luckily, the Intel GPU drivers can change the PWM frequency.
Linked in that forum post is this ArchLinux forum post. There is a command-line tool called intel-gpu-tools (link, source) which can read/write values for the Intel GPU driver. This can be used to increase the PWM frequency and (theoretically) shut off dithering (the links above are about PWM).
I also came across intelpwm-udev, a bash script that sets the PWM frequency for Sandy Bridge chips. It builds on intel-gpu-tools, specifically intel_reg, to make a nicer CLI tool around it. Here's the GitHub repo.
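As a rough sketch of what those scripts do under the hood: on Sandy Bridge the backlight PWM register (the intelpwm scripts read it with `intel_reg read 0xc8254`; that offset is generation-specific, so treat it as an assumption here) packs the PWM period into the high 16 bits and the duty cycle into the low 16 bits. The bit arithmetic looks like this:

```shell
# Split a 32-bit backlight PWM register value into its two halves:
# period (high 16 bits) and duty cycle (low 16 bits). The 0xc8254
# offset used by the intelpwm scripts is Sandy Bridge-specific.
pwm_period() { echo $(( $1 >> 16 )); }
pwm_duty()   { echo $(( $1 & 0xFFFF )); }

# Repack a period/duty pair into one register value.
pwm_pack() { echo $(( ($1 << 16) | $2 )); }  # pwm_pack PERIOD DUTY

val=0x12340567
pwm_period "$val"   # -> 4660 (0x1234)
pwm_duty "$val"     # -> 1383 (0x0567)
```

Raising the PWM frequency means writing a smaller period; intelpwm rescales the duty cycle by the same factor so the brightness stays roughly where it was.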
See also this Reddit thread where I started this research. I Googled "manjoro eystrain" to get these results.
Hope this helps someone. I might give Lubuntu a try on my ThinkPad T430s, since it looks like folks are reporting LXQt might not be bad for eyestrain.
Yeah, these are pretty old posts, I've seen them around, but I think it's a different issue. I've had it on all distros, all GPUs and several different displays. It's not specific to Intel, although that could well spawn some additional issues, which other people had. I also no longer believe it's dithering. I recently did what Wallboy suggested earlier, that is using Nouveau with dithering off and the strain was still there. But I suspect it wasn't always the case, because Nouveau got much much better now. No glitches or artifacts, it works pretty well. Meaning, they may have improved the acceleration somewhat and that in turn may have brought the strain back. It's just a theory, though. The truth is, nothing really helps it. It's just there, period.
P. S. I'm kind of back, still not giving up.
I tried disabling dithering through xrandr
xrandr --output eDP1 --mode 3840x2160_59.97 --set "dithering mode" "off"
But I get this error
X Error of failed request: BadName (named color or font does not exist)
Major opcode of failed request: 140 (RANDR)
Minor opcode of failed request: 11 (RRQueryOutputProperty)
Serial number of failed request: 41
Current serial number in output stream: 41
Any ideas?
Can't test right now, but first I'd input "xrandr --props" to see what's allowed, and I'd also not use --mode to see if that helps.
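A quick way to script that check, as a sketch: filter the `xrandr --props` dump for the property before trying to set it. Property names differ per driver — nouveau exposes "dithering mode", some others "dithering", many nothing at all:

```shell
# has_prop NAME -- reads `xrandr --props` output on stdin and succeeds
# if any output exposes a property with that name.
has_prop() {
  grep -qi "^[[:space:]]*$1:"
}

# Real run (needs a running X session; output name is an example):
#   xrandr --props | has_prop "dithering mode" \
#     && xrandr --output eDP1 --set "dithering mode" "off"
```

If `has_prop` fails, the driver simply doesn't expose the property, and the `--set` call would produce exactly the BadName error shown above.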
I tried removing --mode: xrandr --output eDP1 3840x2160_59.97 --set "dithering mode" "off"
but it just spits out: unrecognized option '3840x2160_59.97'
Here's the output of "xrandr --props"
Screen 0: minimum 8 x 8, current 3840 x 2160, maximum 32767 x 32767
eDP1 connected primary 3840x2160+0+0 (normal left inverted right x axis y axis) 350mm x 190mm
EDID:
00ffffffffffff004d10761400000000
311a0104a52313780ed353a85435ba25
0d525800000001010101010101010101
0101010101014dd000a0f0703e803020
35005ac2100000180000000000000000
00000000000000000000000000fe0059
32584e44804c51313536443100000000
0002410328001200000b010a20200056
BACKLIGHT: 375
range: (0, 1500)
Backlight: 375
range: (0, 1500)
scaling mode: Full aspect
supported: Full, Center, Full aspect
Colorspace: Default
supported: Default, RGB_Wide_Gamut_Fixed_Point, RGB_Wide_Gamut_Floating_Point, opRGB, DCI-P3_RGB_D65, BT2020_RGB, BT601_YCC, BT709_YCC, XVYCC_601, XVYCC_709, SYCC_601, opYCC_601, BT2020_CYCC, BT2020_YCC
max bpc: 12
range: (6, 12)
Broadcast RGB: Automatic
supported: Automatic, Full, Limited 16:235
panel orientation: Normal
supported: Normal, Upside Down, Left Side Up, Right Side Up
link-status: Good
supported: Good, Bad
non-desktop: 0
range: (0, 1)
3840x2160 60.00 + 59.97*
3200x1800 59.96 60.00 59.94
2880x1620 60.00 59.96 59.97
2560x1600 59.99 59.97
2560x1440 59.96 60.00 59.95
2048x1536 60.00
1920x1440 60.00
1856x1392 60.01
1792x1344 60.01
2048x1152 60.00 59.90 59.91
1920x1200 59.88 59.95
1920x1080 59.96 60.00 59.93
1600x1200 60.00
1680x1050 59.95 59.88
1400x1050 59.98
1600x900 60.00 59.95 59.82
1280x1024 60.02
1400x900 59.96 59.88
1280x960 60.00
1368x768 60.00 59.88 59.85
1280x800 59.81 59.91
1280x720 59.86 60.00 59.74
1024x768 60.00
1024x576 60.00 59.90 59.82
960x540 60.00 59.63 59.82
800x600 60.32 56.25
864x486 60.00 59.92 59.57
640x480 59.94
720x405 59.51 60.00 58.99
640x360 59.84 59.32 60.00
1280x720_334.00 333.81
3840x2160_59.97 59.96
Maybe I should try lowering the BPC from 12 to 8 or 6? I haven't figured out how to do that on Intel yet.
There is no dithering property showing up, so it's probably not supported by the driver.
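On the "max bpc" idea: that property in the props dump above is writable via xrandr on the i915 driver. A hedged sketch (the output name eDP1 is taken from the dump; whether 8 bpc actually stops the panel from dithering is driver- and panel-dependent):

```shell
# Read the current "max bpc" value out of `xrandr --props` text.
current_max_bpc() {
  awk -F': ' '/^[[:space:]]*max bpc:/ { print $2; exit }'
}

# Real run (needs a running X session):
#   xrandr --props | current_max_bpc         # e.g. 12
#   xrandr --output eDP1 --set "max bpc" 8   # takes effect on next modeset

# Demo on a snippet of the props dump shown above:
printf '\tmax bpc: 12\n\t\trange: (6, 12)\n' | current_max_bpc   # -> 12
```

The props dump shows the allowed range is (6, 12), so both 8 and 6 should be accepted values.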
cizeta Haven't figured out how to do that yet on Intel
I don't know for sure -- but I think the process to disable dithering on Intel should be similar to Windows? The ditherig source code is here: https://github.com/skawamoto0/ditherig/tree/master/ditherig. It won't compile on a *nix system because it uses Win32 libraries.
But from what I can tell, it's writing to a PCI register to disable the dithering. I think you might be able to use intel-gpu-tools or pciutils to do this.
(This is not my area of expertise and this is just speculation / food for thought).
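For what it's worth, the Linux i915 driver sources name a dithering-enable bit in the pipe configuration register. The offset 0x70008 (PIPECONF for pipe A) and bit position 4 below are assumptions taken from older driver sources and should be verified against your chip's register documentation before writing anything. The read-modify-write such a toggle would do is plain bit arithmetic:

```shell
# Sketch of the read-modify-write a dithering toggle would perform.
# PIPECONF offset 0x70008 (pipe A) and dither-enable bit 4 come from
# older i915 driver sources -- verify them for your GPU generation
# before using `intel_reg write`.
PIPECONF=0x70008
DITHER_BIT=4

clear_bit() { echo $(( $1 & ~(1 << $2) )); }  # clear_bit VALUE BITPOS

# Example: a value with the dither bit set and 0x3 elsewhere:
clear_bit $(( (1 << DITHER_BIT) | 0x3 )) "$DITHER_BIT"   # -> 3
```

The real sequence would be `intel_reg read $PIPECONF`, clear the bit, then `intel_reg write $PIPECONF <new value>` -- and again, only after confirming the register map for your generation.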