Next I'll try DisplayPort-to-VGA converters; maybe the analog VGA signal itself "dampens" the output in some way, and maybe that has a good effect on how my eyes see things, just like under normal circumstances.
Now, here is something I was thinking about. It's purely a theory, because I'm not an expert in this field:
According to Wikipedia, HDR/Dolby Vision/etc. use the Rec. 2100 (BT.2100) colorimetry; you can see it here:
https://en.wikipedia.org/wiki/Rec._2100
At the end of the page it also states that SDR uses the Rec. 709, sRGB, Rec. 2020 and BT.1886 standards, which I find usable; e.g. on the TV I can set BT.1886 and it's fine to live with.
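Just to show concretely how different the HDR and SDR sides are, here is a rough Python sketch of the BT.1886 (SDR) and PQ/SMPTE ST 2084 (Rec. 2100 HDR) transfer functions; the formulas are the published ones, but the 100-nit white / 0.1-nit black display values are just example assumptions:

```python
import numpy as np

def bt1886_eotf(v, lw=100.0, lb=0.1):
    """BT.1886 EOTF: video signal (0..1) -> luminance in nits.
    lw/lb (display white/black luminance) are example values."""
    g = 2.4
    a = (lw ** (1 / g) - lb ** (1 / g)) ** g
    b = lb ** (1 / g) / (lw ** (1 / g) - lb ** (1 / g))
    return a * np.maximum(v + b, 0.0) ** g

def pq_eotf(v):
    """SMPTE ST 2084 (PQ) EOTF used by Rec. 2100: signal (0..1) -> nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    vp = np.maximum(v, 0.0) ** (1 / m2)
    return 10000.0 * (np.maximum(vp - c1, 0.0) / (c2 - c3 * vp)) ** (1 / m1)

v = np.array([0.25, 0.5, 0.75, 1.0])
print("BT.1886:", bt1886_eotf(v).round(1))  # tops out around 100 nits
print("PQ:     ", pq_eotf(v).round(1))      # same signals reach thousands of nits
```

The same 0.5 signal means roughly 22 nits under BT.1886 but roughly 92 nits under PQ, and full signal goes from ~100 nits to 10000 nits, so a panel driven in HDR mode behaves very differently from the SDR case I'm used to.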
A color space comparison:
https://en.wikipedia.org/wiki/Rec._2020#/media/File:CIE1931xy_gamut_comparison_of_sRGB_P3_Rec2020.svg
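To make that comparison concrete, here is a rough Python sketch of what happens when a fully saturated Rec. 2020 color is pushed through the commonly published linear-RGB conversion matrix into Rec. 709 (the test color is just an example I picked):

```python
import numpy as np

# Approximate linear-RGB conversion matrix Rec. 2020 -> Rec. 709
# (values as published around ITU-R BT.2087 and similar documents).
M_2020_TO_709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

# A fully saturated Rec. 2020 green: an example color outside the 709 gamut.
rgb2020 = np.array([0.0, 1.0, 0.0])
rgb709 = M_2020_TO_709 @ rgb2020
print(rgb709)                     # ~[-0.588  1.133 -0.101]
print(np.clip(rgb709, 0.0, 1.0))  # clipped to [0. 1. 0.]: a different color
```

The negative components mean the color simply doesn't exist inside the Rec. 709 triangle, so the display or converter has to clip or remap it somehow, and different devices may do that differently.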
I can imagine two things: either these wider color spaces are themselves disturbing to the eyes, or these wider color spaces are somehow represented unfaithfully/incorrectly by displays.
If these color spaces are disturbing, then are we people here simply treated as "below threshold" cases? No idea.
If these color spaces are represented incorrectly (EDIT: dithering probably fits in here too, see the sketch below), could it be, for example, that they found a method to emit a very similar color space on a display whose hardware doesn't support it? Not all displays are HDR, but maybe this way they can produce a similar-to-HDR picture? Just like games having the very same properties and effects? No idea, and then why does the HDR/Dolby-compliant TV cause the same problems?
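On the dithering part: as far as I know many panels use frame rate control (FRC), i.e. they alternate a pixel between two levels the hardware can actually show, so that the time-average looks like an in-between level it can't show. A rough Python sketch of the idea (the 10-bit-on-an-8-bit-panel case is just an example assumption):

```python
def temporal_dither(level10, frames=4):
    """Approximate one 10-bit level (0..1023) on an 8-bit panel (0..255)
    by alternating between the two nearest 8-bit levels over `frames`
    frames -- the basic idea behind FRC / temporal dithering."""
    lo = level10 // 4            # nearest lower 8-bit level
    frac = (level10 % 4) / 4.0   # how far toward the next 8-bit level
    hi_frames = round(frac * frames)
    return [lo + 1 if f < hi_frames else lo for f in range(frames)]

shown = temporal_dither(514)     # a 10-bit level between 8-bit 128 and 129
print(shown)                     # [129, 129, 128, 128]
print(sum(shown) / len(shown))   # 128.5: the "in-between" level, on average
```

A measuring device would see the 128.5 average, but the pixel is actually flickering between 128 and 129 every couple of frames.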
Maybe both of these are possible?