So my aging Windows 7 gaming laptop is finally at the point where it struggles with day-to-day tasks.

I've decided to finally rebuild my gaming rig. I have known-good monitors (Dell 2317s; yes, LED, but they work for me when attached to Intel graphics) and a theoretically good video card (it was in my kid's machine): an MSI Gaming GTX 970 4GB. Running Windows 10 21H2, which works fine for me on other systems (work, for example).

I've put them into a Dell Precision 3610 running a Xeon processor… and the display is BAD. Very blocky fonts, and hard to look at.

Not sure what's going wrong here, but the same card and same setup on my kids' machines (both of which are last-gen Core i7 boxes) look fine. One kid has a GeForce 780 and the other has a GeForce 1660, and both look… fine (the 780 is better, naturally, but both are workable). This one looks like ass.

Thoughts? Have we determined that the CPU/chipset matter? I wouldn't think so, but this test was pretty conclusively awful. The display just looks BAD, which makes me wonder if it's an OS settings thing. I couldn't get ClearType to a place that I like.
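Assuming Windows 10 here, one sanity check before blaming the hardware is whether font smoothing actually ended up enabled and set to ClearType after the driver install. These are the standard Windows tools and registry values (a diagnostic sketch, not a fix):

```shell
:: Re-run the built-in ClearType Text Tuner
cttune

:: Font smoothing should be on (FontSmoothing = 2) and
:: set to ClearType (FontSmoothingType = 0x2)
reg query "HKCU\Control Panel\Desktop" /v FontSmoothing
reg query "HKCU\Control Panel\Desktop" /v FontSmoothingType
```

Also worth confirming the desktop is at the panel's native resolution and 100% scaling; non-native resolutions or GPU-side scaling can make fonts look exactly this blocky.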

Mind you, I use my Dell Latitude 7490 every day, for hours on end, with no problem now (better glasses and single-eye training and lots of macular supplements sure help). I guess I could use THAT (or one just like it) with my external monitors, but the dock tends to mess with the display and make it bad… and I really want a little more oomph.

    Hi! There is no single correct answer to your question. There are so many factors: CPU, GPU, GPU driver, motherboard, motherboard chipset drivers, computer firmware, operating system, operating system version, etc.

    I think there are a lot of factors that we can't even imagine.

    I was surprised and upset when I noticed that the CPU and motherboard matter for the eyes. You can read my answer here https://ledstrain.org/d/1685-new-graphics-card-eye-strain-issue-could-motherboard-impact-it/12 (and the whole topic). Something has changed with newer CPUs/motherboards. Every computer newer than Intel 8th generation has this problem, and nobody knows what was changed.

    Gurm Nice hearing from you again!

    I think the key to making progress on these issues is empirical research, as opposed to the "throwing darts" I'm sure we're all used to. If you were here, I have some ideas involving my lossless capture project that could probably isolate output issues such as this. Failing that, if you could obtain the same capture hardware and coordinate testing of multiple "scenarios" in terms of the hardware/software used, that might also work.


    I may need to upgrade soon, too. I'm out of touch with the current technology. My first requirement would be something that would not interfere with graphics cards at all, as in, I plug in the graphics card and can be sure the rest of the PC hardware does not interfere with the card's video output. Would I have to avoid specific CPUs or mainboard types? Which ones are safe?

    dev