whystrainwhy: Also, is it possible that true 8- or 10-bit displays would make the dithering unnecessary, or do you think the dithering would just happen anyway?
Unfortunately there are still "reasons" to dither even if the desktop and monitor have the exact same bit depth - mainly color calibration.
An 8-bit desktop has 256 shades of each RGB component, but once a calibration is applied ("night shift" settings, gamma correction, automatic contrast enhancement, sRGB gamut content "mapped within" a P3 gamut environment, LCD panel uniformity compensation, and such), suddenly you can't directly display all 256 shades of gray anymore, because the color range is being "crushed" by the calibration.
The calibration maps a ton of colors to floating-point values instead of whole numbers, so "just rounding them off" would cause some form of subtle banding, no matter how high the bit depth.
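A rough sketch of that collapse, using a simple gamma tweak as a stand-in for any calibration step (the gamma value is hypothetical, just to make the effect visible):

```python
# Apply a mild gamma "calibration" to all 256 8-bit gray levels,
# then round back to whole 8-bit codes like the display pipeline must.
gamma = 1.1  # hypothetical calibration curve

levels = range(256)
calibrated = [255 * (v / 255) ** gamma for v in levels]  # floating-point shades
rounded = [round(c) for c in calibrated]                 # forced back to integers

unique_out = len(set(rounded))
print(unique_out)  # fewer than 256: some input shades now land on the same output code
```

Several formerly distinct dark shades round to the same code while some bright codes are skipped entirely, which is exactly the step pattern you see as banding.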
GPU companies want to eliminate banding wherever they can in order to make images look "smoother", so dithering is applied to "fake" these in-between shades of colors. Even if you have 8-bit content connected to a true 8-bit monitor, there are enough layers of calibration between the desktop and the output that a 1-to-1 mapping isn't possible without either rounding up/down (banding) or dithering.
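Here's a minimal sketch of how dithering "fakes" a fractional shade (the target value is hypothetical; real hardware uses ordered or error-diffusion patterns rather than plain randomness, but the averaging idea is the same):

```python
import random

target = 127.3  # fractional shade produced by some calibration step (hypothetical)

# Plain rounding: every pixel becomes 127, the .3 is lost -> a visible step.
rounded = round(target)

# Dithered quantization: output 128 instead of 127 for ~30% of pixels,
# so a patch of pixels averages back out to ~127.3.
random.seed(0)
n = 100_000
frac = target - int(target)
dithered = [int(target) + (1 if random.random() < frac else 0) for _ in range(n)]
avg = sum(dithered) / n
print(rounded, avg)  # 127 vs roughly 127.3
```

Each individual pixel is still only 127 or 128, but your eye integrates the patch into the in-between shade, which is why the noise trades banding for grain.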
The push for HDR makes all of this even worse, because the goal of HDR is to represent even more colors, such as a shade of white that's "brighter than typical pure white". All of the math required to map the "totally arbitrary and relative floating-point shades of colors" that HDR uses into even a 10-bit output, while still "eliminating" banding, means dithering everywhere.
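To see why even 10 bits doesn't help, here's a toy example assuming a simple linear scale with 4x headroom over SDR white (real pipelines use PQ/HLG transfer curves, but the mismatch is the same):

```python
# Map a relative HDR luminance into a 10-bit code value.
peak_headroom = 4.0   # hypothetical "brighter than pure white" range
luminance = 1.0       # classic SDR pure white

code = luminance / peak_headroom * 1023
print(code)  # 255.75 -> even plain "pure white" falls between two 10-bit codes
```

The moment shades are defined relative to an arbitrary peak, almost nothing lands on a whole code value, so it's round-and-band or dither, all over again.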
This could all still be avoided by simply rounding up/down instead of dithering, but GPU companies don't want to do that, as it would make their image "quality" and color "accuracy" look "worse" compared to competitors.
Today, they just throw dithering on everything as a catch-all "solution" - even when the banding would be slight enough that no one aside from tech reviewers and the film/photography/publishing sector (i.e. professions where color "accuracy" matters above all else, and text clarity and readability matter much less) would care.
This is also why expensive "Pro"/artist/studio-branded products are even worse when it comes to text clarity, eye strain, and the ability to clearly present information-dense UI and content than cheaper, "less color accurate" products.