Dizzy, there's also the chance that the Pluggable adapter uses a more """advanced""" chip for either its HDMI controller or its decompression module, and adds stuff like noise reduction, oversharpening, or extra dithering to the image.
since DisplayLink uses lossy compression, not lossless, there are many different ways that stream can be decompressed and post-processed… possibly Pluggable themselves don't even "know about the problem" -- it could instead be whichever third-party manufacturer made the hardware decoding chip they put inside the adapter.
if that's the case, it's pretty likely that a chip decoding lossy data will try to tack on some mystery "image enhancement" tech to "compensate" for the compression…
how much the image gets modified, what types of processing are applied, and whether those types of processing mess with your vision can all vary depending on the hardware inside each adapter
so it's really just a matter of trying different ones until you get lucky
for example: I've recently been looking at data sheets for HDMI and LCD controllers from as early as the mid-2000s(!) and there are so many mentions of everything from "temporal noise reduction" to tons of different dithering methods to "local dynamic contrast enhancement", the list goes on.
(my current understanding is that a lot of this stuff was designed for TVs, then basically gets reused for every other type of display tech because "that's the most cost effective", even if it's totally unfit for e.g. information-dense UI content and makes text look terrible)
so it makes sense that each adapter outputs a slightly different image
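(to make "tons of different dithering methods" concrete: here's a minimal sketch of one classic technique, 4x4 Bayer ordered dithering -- this is the textbook version in Python, not pulled from any specific controller's data sheet)

```python
import numpy as np

# classic 4x4 Bayer threshold matrix, normalized to [0, 1)
BAYER_4X4 = np.array([
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]) / 16.0

def ordered_dither(gray, levels=2):
    """Quantize a grayscale image (floats in [0, 1]) down to `levels`
    shades, hiding the banding behind a fixed high-frequency pattern."""
    h, w = gray.shape
    # tile the threshold matrix across the whole image
    thresh = np.tile(BAYER_4X4, (h // 4 + 1, w // 4 + 1))[:h, :w]
    steps = levels - 1
    # the per-pixel threshold nudges each value up or down before
    # quantizing, so smooth gradients become a visible dot pattern
    return np.floor(gray * steps + thresh) / steps
```

on photos/video that pattern is nearly invisible, but on sharp text edges it reads as fringing -- and if a chip applies it temporally (alternating the pattern every frame), that's exactly the kind of flicker some people are sensitive to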
-
the only way to figure out exactly why would be to get a lossless capture card, take a direct frame grab of each adapter displaying the exact same desktop UI, and compare the colors pixel by pixel
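if anyone wants to try that, the comparison itself is the easy part -- something like this works (assuming lossless PNG grabs off the capture card; the filenames are placeholders):

```python
import numpy as np
from PIL import Image

def compare_grabs(path_a, path_b):
    """Pixel-diff two lossless frame grabs of the same desktop UI."""
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    assert a.shape == b.shape, "grabs must be the same resolution"
    diff = np.abs(a - b)                 # per-channel deltas
    changed = np.any(diff > 0, axis=-1)  # mask of pixels that differ at all
    print(f"pixels that differ: {changed.mean():.2%}")
    print(f"max channel delta:  {diff.max()}")
    if changed.any():
        print(f"mean delta on changed pixels: {diff[changed].mean():.2f}")
    return diff

# placeholder filenames -- one grab per adapter, same still desktop frame
compare_grabs("adapter_a.png", "adapter_b.png")
```

roughly speaking: a diff concentrated around text edges would point at sharpening, while a faint uniform diff across the whole frame would point more at dithering or noise reduction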