I found an interesting old (but still relevant) forum post, linked below.
It questions the jump to 8-bit color and beyond in the first place, as well as our ability to actually perceive millions of colors.
In need of a new LCD monitor recently, I went searching at the local Best Buy for a suitable replacement. Even though I gave up on trying to get my recently acquired Mac computer to do anything useful and eventually returned it, that Apple Cinema Display was just gorgeous! In fact, it spoiled me to the point that looking at my existing LCD PC monitor made me almost wish I had that Mac back on my desk. So I went in search of a good LCD monitor for my Windows 7 system. Surely, I thought, technology had improved in the three years since I bought my last LCD monitor, and I'd be able to just run to the local store and buy something equivalent to the Apple Cinema Display? What I found was a host of new-tech LED-backlit LCD displays that looked promising on paper, but after going to the store to review them, I was in for a big surprise. I ended up losing almost two days of work trying to sort out the junk being sold now, researching different panel types such as TN versus IPS, and, somewhere along the way, picking up on what appears to be a well-guarded secret in the industry: your typical LCD monitor from the local brick-and-mortar store can only display about 262,000 colors, not the 16.7 million being sent by your 24- or 32-bit video board!
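For anyone who wants to check those numbers, the gap is just bits per channel cubed: a 6-bit-per-channel panel tops out around 262,000 colors, a true 8-bit panel gives the familiar 16.7 million. Here's a quick back-of-the-envelope calculation in Python; the bit depths are the usual panel specs, not anything measured from a specific monitor:

```python
# Colors a panel can natively display = (2 ** bits_per_channel) ** 3
for bits, panel in [(6, "typical TN panel"), (8, "true 8-bit panel"), (10, "true 10-bit panel")]:
    colors = (2 ** bits) ** 3
    print(f"{bits}-bit per channel ({panel}): {colors:,} colors")

# Output:
# 6-bit per channel (typical TN panel): 262,144 colors
# 8-bit per channel (true 8-bit panel): 16,777,216 colors
# 10-bit per channel (true 10-bit panel): 1,073,741,824 colors
```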
It seems highly plausible to me that, as a failsafe to ensure a uniform 8-bit-or-better color experience across all panel types, tech companies are shipping products with dithering enabled at the hardware level. I suspect there are hardly any true 8- or 10-bit monitors out in the wild for the average consumer, and that the eye strain so many of us have developed in recent years comes from the industry's insistence on making our computers look more vibrant, to sell more products and tack bigger numbers onto the spec sheet.
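In case the mechanism isn't familiar, here's a rough sketch of how that hardware-level dithering (FRC, frame rate control) works in principle: the panel flips between the two nearest 6-bit levels fast enough that your eye averages them into the intended 8-bit shade. This is only an illustration of the idea, not any manufacturer's actual FRC implementation; the function name and the fixed 4-frame cycle are my own simplification:

```python
def frc_frames(value_8bit: int) -> list[int]:
    """6-bit levels shown over a 4-frame cycle to approximate one 8-bit target."""
    low = value_8bit >> 2        # nearest 6-bit level at or below the target (drop 2 LSBs)
    high = min(low + 1, 63)      # next 6-bit level up, clamped to the panel maximum
    frac = value_8bit & 0b11     # leftover 2 bits: how many of the 4 frames show `high`
    return [high] * frac + [low] * (4 - frac)

target = 130                     # 8-bit level we want the 6-bit panel to fake
frames = frc_frames(target)
print(frames)                            # [33, 33, 32, 32]
print(sum(frames) * 4 / len(frames))     # 130.0 -> time-average lands on the target
```

The point of the sketch is simply that the "extra" colors come from flicker averaged over time, which is exactly why some people suspect it as a source of eye strain.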
OTOH, it has also been shown that the difference between a temporally dithered 8-bit monitor and an actual 10-bit monitor is negligible, or at least good enough for 99% of users. On photography forums, some Mac users are unhappy about the forced temporal dithering by Apple, believing they are wasting thousands on Cinema Displays only to be fooled by software trickery, never seeing what the panel is natively capable of.