I believe this all comes down to flickering/flashing.
Back in the days of CRTs, I was able to get by using 75Hz monitors. Then later, 120Hz CRTs came out.
With LCDs I had issues until the gaming versions arrived, which ran at 120Hz and at the time were very expensive. Since then I have mostly used gaming laptops with 120Hz screens, and until recently I could always find one that worked.
But now it seems that even 120Hz no longer works for me. So as you said, I may try 240Hz and see if that helps.
But overall, yes, I do believe it is a flicker issue in all cases. Another piece of evidence that supports this (at least to me) is that I can use the same Xbox console with the same TV and have no issue with one game, yet another game will give me instant eyestrain, even at just the menu screen. Obviously, the way those games are being refreshed is the issue, as the hardware is exactly the same.
Also, I will sometimes have a computer setup that works for me, yet certain streaming videos will cause instant eyestrain even though other activities are fine on that same setup. Streaming videos use tricks with the framerate, so somehow a certain framerate must "gang up" with whatever setup I have to reach some irritation-causing frequency of flicker.
So basically, every element of your setup, from the OS to the monitor to the graphics card to the content on screen, contributes some amount of flicker, and it all adds up. The sum total of flicker detected by your eyes either falls into the non-irritation zone or the irritation zone. Changing one thing can sometimes be enough to shift the total and remove the irritation. The flip side is that one small change to any one of those elements, such as a driver update, can push things the other way and cause a good setup to start having issues.
That's why I've decided to focus on screen refresh rates as a solution, since that's the easiest factor for me to control and seems to have the most effect.