Windows 10 absolutely introduces a new type of flicker or difficulty. It started with the "Creators Update" (version 1703, released in 2017). It is "better" with the recent releases 1809 and 1903, but not perfect. The only version of Windows 10 that functions like Windows 7 did... is 1511, the final 2015 release. Some people have found that the first 2016 release was OK also.
Windows 10 causing FRC, Dithering, PWM or something on same hardware?
Whilst others report problems with Windows 10, I was unable to find any evidence of dithering caused by the operating system alone in my testing. I'm not doubting others, but it's something to bear in mind.
When doing the same testing I found that my GTX 660 does use temporal dithering (along with every other Windows 10 compatible card I tested), so I imagine it's quite likely your GTX 650 does as well. Though, potentially it didn't when your computer was running Windows 7 in 16-bit mode. The only option I found that did not have temporal dithering was Intel's integrated graphics. I have been told that 59 Hz and 60 Hz are the same, that 59 Hz is short for 59.99999 Hz.
Best advice I can give is to try a few different things and see how you go, and expect it to often not make sense. For example, I found going from my GTX 660, which dithered, to Intel integrated graphics, which does not, actually made my own problems worse. I think I'd simply gotten so used to it that it was uncomfortable without it.
Seagull I don't think Windows 10 "dithers". But there's something about their new "composition spaces" overlay system that makes my eyes want to crawl out of my head. I can demonstrate it quite plainly by queueing up the 16xx update on my laptop, letting it install, going OW OW OW and rolling back to 1511 and going "AHHHHHH".
Thanks. Unfortunately, installing 1511 is not an option for me.
Is the 8-bit IPS display indeed supposed to be able to handle 32-bit color depth without any of those things which I call 'tricks' (changed blanking, dithering, FRC, PWM, reduced refresh rate, etc.)?
Is it possible that as long as I was using 16-bit color depth, the various components (the GPU, the DVI-D single-link cable, the display, the motherboard, the BIOS, or whatever other part) were able to cope with the 1920x1200 resolution at 60 Hz, but that the 32-bit color depth is in some way too much for one of the components, and that this results in some 'trick' such as changed blanking, FRC, dithering, or something else?
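One part of this question can be checked with plain arithmetic (this is my own back-of-envelope sketch, not from the thread, and the blanking totals are assumed CVT reduced-blanking figures): single-link DVI tops out around a 165 MHz pixel clock, and the DVI link always carries 24 bits per pixel regardless of whether the desktop is set to 16-bit or 32-bit color, so the color-depth switch by itself shouldn't overload the cable.

```python
# Back-of-envelope check: does 1920x1200 @ 60 Hz fit within single-link
# DVI's ~165 MHz pixel clock limit? Blanking totals below are assumed
# CVT reduced-blanking values, not measured from any specific monitor.
# Note: DVI transmits 24 bits per pixel on the wire either way, so the
# desktop's 16-bit vs 32-bit setting does not change link bandwidth.

H_ACTIVE, V_ACTIVE, REFRESH = 1920, 1200, 60
H_BLANK, V_BLANK = 160, 35  # typical CVT reduced-blanking totals (assumed)

pixel_clock_hz = (H_ACTIVE + H_BLANK) * (V_ACTIVE + V_BLANK) * REFRESH
print(f"{pixel_clock_hz / 1e6:.2f} MHz")        # ~154.13 MHz
print(pixel_clock_hz <= 165_000_000)            # True: fits single-link DVI
```

So on paper the cable isn't the bottleneck at 32-bit; if a 'trick' kicks in when switching color depth, the GPU or driver is the more likely culprit.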
NST17 that's sort of a loaded question. In general, yes. True 8-bit screens should ... SHOULD ... be able to display 32-bit color without problems/tricks/compromises. In practice, however, this is seldom the case.
However, in your particular situation, Windows itself is adding some unpleasantness.
Is Windows 10 doing more than forcing me to use the 32-bit color depth, and is there a way to solve this with either software or hardware?
Yes, it is. And no, there isn't. At least not that we know of.
I just realized that the forum's name is LEDstrain, but as far as I know my display does not have a led backlight. Does that imply that there is still hope for me/ my display?
What is it that Windows 10 does differently which causes the problem? (I may not understand the answer, but it may possibly help in figuring out what's next).
Is there any solution that involves replacing hardware so that it won't have to do those 'tricks'?
Thanks
NST17 It's called LEDStrain because a lot of our problems started when LED backlighting was being rolled out and widely adopted around 2008-2012. But a lot happened during that period as GPUs, drivers, OS display technology, etc. changed significantly as well, so LED technology is neither the beginning nor the end of our problems.
Is it possible to figure out whether my hardware would have been able to cope with a 32-bit color depth on Windows 7 without using 'tricks'?
NST17 Welcome to the forum.
I have recently bit the bullet and bought a Dell U2419H and Intel NUC (NUC8i7BEH) which is running the latest Windows 10 Pro + Latest Intel Graphics Drivers.
I have been using the machine consistently for the last two weeks. I wanted to allow that much time to get over any possible placebo effects and also rigorously use the machine to see what would happen if I used it 8+ hours a day. The side effects were quite profound: I noticed after day 1-2 that I was blinking less than usual, my eyes had trouble focusing at distance when going outside, and I had feelings of tunnel vision; looking at my reflection also showed my pupils were dilated a LOT. I did have tension headaches for the first few days; however, this actually subsided. I finally connected my known-good desktop to the U2419H 2 days ago, and within 5 minutes I immediately felt relief - the tight band around my head had been released. I've been on my old desktop for the last 2 days and can now compare and contrast how I felt on the new NUC. It almost felt like I was being drugged, and that's not hyperbole.
The positive thing at least in my experience is that this new Dell monitor doesn't cause much strain, no different to my U2414H I bought a few years ago. It clearly is the OS/Driver causing the issues (in most cases) and not the display.
As to the where/why/how it's happening, I don't know. If anybody here can get a VMware licence from their employer or find a way to obtain a copy (for evaluation purposes), then it should be straightforward to set up a host/client using PCoIP (which works at the pixel level) and document activity between good/bad machines. There is a log viewer available from one of the VMware developers which has a section for pixel data; perhaps we can reach out to him and ask if different styles of dithering (spatio-temporal, temporal) can be measured using his tool.
Aside from a very expensive capture setup, it could be another approach to consider.
I suspect subpixel rendering, or some other tomfoolery. People have found relief (not consistently) with color profiles, switching display ports, etc. and that indicates that it's a rendering issue.
diop If anybody here can get a VMware licence from their employer or find a way to obtain a copy (for evaluation purposes), then it should be straightforward to set up a host/client using PCoIP (which works at the pixel level) and document activity between good/bad machines. There is a log viewer available from one of the VMware developers which has a section for pixel data; perhaps we can reach out to him and ask if different styles of dithering (spatio-temporal, temporal) can be measured using his tool.
If I understand correctly, you suggest using VMware as the client side. But what will be the server side? Is PCoIP a purely software thing? I thought it was semi-hardware.
glvn If I understand correctly, you suggest using VMware as the client side. But what will be the server side? Is PCoIP a purely software thing? I thought it was semi-hardware.
It used to be. Here's some info from a quick search:
Teradici created the PCoIP protocol and debuted it in 2007. Initially, PCoIP was a hardware-based desktop virtualization product designed around a blade server that rendered desktop images, and a client device that somewhat resembled a hockey puck. The client device was equipped with a proprietary chip that enabled the use of PCoIP communications between the client and server.
PCoIP initially depended on proprietary hardware, but Teradici eventually created a software version of the PCoIP protocol. Teradici licensed the software to VMware in 2008. VMware used PCoIP to deliver virtual desktops with what was then called VMware View, now Horizon.
So in theory, with VMware software and a license, you can 'roll your own' VMware server/client setup and then, with third-party tools, analyse the PCoIP data, which transmits the desktop information at the pixel level. Installs could be performed on good/bad machines, and it should be possible to see if temporal dithering is used (more bandwidth than usual being transmitted, or more pixels being delivered).
The end result would hopefully show differences between good/bad OS & drivers, and as the pixels are being transmitted 1:1, the movement/dithering algorithm should be detectable. This won't help with bad physical outputs, but would in my view be something on paper to show to M$/Apple: "this OS and driver is comfortable but this other OS and driver isn't because of differences in dithering/pixels, and here is the PCoIP data which proves the pixels are using X dithering algorithm."
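The detection idea described above can be sketched in a few lines. This is a hypothetical illustration only: it assumes frames have already been captured by some means (capture card, PCoIP log dump, etc.) and are available as nested lists of RGB tuples; the function names and the noise threshold are my own inventions, not part of any VMware tool.

```python
# Hypothetical sketch: detect temporal dithering by diffing successive
# captured frames of a *static* image. Frame capture itself is out of
# scope; frames here are nested lists of (R, G, B) tuples.

def changed_pixel_fraction(frame_a, frame_b):
    """Return the fraction of pixels whose RGB values differ between frames."""
    total = 0
    changed = 0
    for row_a, row_b in zip(frame_a, frame_b):
        for px_a, px_b in zip(row_a, row_b):
            total += 1
            if px_a != px_b:
                changed += 1
    return changed / total if total else 0.0

def looks_temporally_dithered(frames, threshold=0.01):
    """Flag a capture as dithered if consecutive static frames keep differing.

    `threshold` is an assumed tolerance for capture noise; on a genuinely
    static, undithered output the changed fraction should be ~0.
    """
    fractions = [changed_pixel_fraction(a, b) for a, b in zip(frames, frames[1:])]
    return all(f > threshold for f in fractions)

# Toy demo: a 2x2 grey patch whose least significant bits flip every frame
# (mimicking temporal dithering) vs. a patch that never changes.
static = [[[(128, 128, 128)] * 2 for _ in range(2)]] * 3
dithered = [
    [[(128, 128, 128), (129, 128, 128)], [(128, 129, 128), (128, 128, 128)]],
    [[(129, 128, 128), (128, 128, 128)], [(128, 128, 128), (128, 129, 128)]],
]
print(looks_temporally_dithered(static))    # False: frames are identical
print(looks_temporally_dithered(dithered))  # True: LSBs toggle each frame
```

With real captures, the interesting signal would be a small but persistent changed-pixel fraction on a supposedly static desktop on the "bad" machine, and a near-zero fraction on the "good" one.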