A mesh of LED lights won’t affect inversion; that happens solely within the LCD panel.
On to your second question.
First, let’s re-examine what LCD inversion is. Liquid crystals degrade if you apply a constant voltage to them, so to prevent this degradation the drive voltage switches polarity every frame, from + to – or – to +. If this switching were perfect we wouldn’t see any change, but in practice the two voltages aren’t perfectly matched, so the crystals shift slightly every frame.
So every frame some pixels get brighter and some get dimmer. The next frame they swap: the dim ones become bright, the bright ones go dim. This creates a kind of flickering effect at half the frame rate of the panel. It’s half because it takes two frames to complete the bright->dim->bright cycle.
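A toy calculation makes the arithmetic concrete (the voltages and numbers here are invented for illustration, not measured from any real panel):

```python
# Toy model of LCD inversion flicker, not a real panel simulation.
# Assumes a pixel whose + and - drive voltages are slightly mismatched
# (the values are made up for illustration).
refresh_hz = 60
v_pos, v_neg = 5.02, 4.98  # imperfectly matched drive voltages

# Perceived brightness tracks the magnitude of the drive voltage,
# and polarity flips every frame, so brightness alternates:
brightness = [v_pos if frame % 2 == 0 else v_neg for frame in range(6)]
print(brightness)  # [5.02, 4.98, 5.02, 4.98, 5.02, 4.98]

# One full bright->dim->bright cycle takes two frames, so the flicker
# sits at half the refresh rate:
flicker_hz = refresh_hz / 2
print(flicker_hz)  # 30.0
```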
All LCDs do this, though the effect will be weaker or stronger depending on the inversion implementation. I personally suspect that panels with higher pixel density are worse: smaller pixels mean smaller electronics, more variation, and looser production tolerances. So I expect laptops to be worse than full-size monitors.
[LCD Inversion + Superposition of Dithering]
So, getting back to your original question: how can you make a good setup bad? Well, let’s compare inversion with temporal dithering. With temporal dithering, pixels go bright->dim and back again just like inversion, but not necessarily at half the panel frequency. I found in my testing that whilst some pixels may be dithered every other frame, others might only change every 60 frames, or less often still. So temporal dithering produces flicker at many simultaneous frequencies, anywhere between 0 Hz and half the frame rate of the monitor. A kind of broadband flicker.
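As a rough sketch of that claim (the toggle intervals below are invented, not measured):

```python
# Sketch of "broadband" dithering flicker. Suppose three pixels each
# flip between two levels at different intervals (frames between flips);
# the intervals are invented for illustration.
refresh_hz = 60
toggle_every = [1, 30, 60]  # frames between level flips, per pixel

# A full dim->bright->dim cycle takes two toggle intervals, so each
# pixel flickers at refresh / (2 * interval):
freqs = [refresh_hz / (2 * t) for t in toggle_every]
print(freqs)  # [30.0, 1.0, 0.5]
```

The fastest possible toggle (every frame) lands exactly at half the refresh rate, and slower toggles fill in the frequencies below it, which is why the result spans 0 Hz up to half the frame rate.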
I think that the broadband flicker of temporal dithering might be an effective way of breaking up the half-frame-rate flicker caused by LCD inversion.
So, back to changing setups. If you change the colour profile, you change how temporal dithering occurs. Pure white (255,255,255) could now be off-white (255,255,254); a shade that was previously dithered at x Hz might now be dithered at y Hz, and that could feel better or worse. Changing a GPU gives you an entirely new temporal dithering algorithm. Changing from a digital to an analogue display output will also have an effect, as interference makes VGA fuzzier (more broadband flicker). Perhaps a Windows update changes how the alpha channel is processed, and that has a knock-on effect on dithering and colours.
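One concrete (simplified) way a one-step colour shift can matter is FRC-style dithering, where a 6-bit panel fakes 8-bit shades by toggling between adjacent levels. Under that assumption, a tiny profile shift can turn a static pixel into a flickering one:

```python
# Hedged sketch: a 6-bit panel showing 8-bit values via temporal
# dithering (FRC). The linear mapping below is a simplification of
# what real panels actually do.

def frc_levels(value8):
    """Map an 8-bit level to the nearest lower 6-bit level, plus the
    fraction of frames the panel must show the next level up to fake
    the in-between shade."""
    exact = value8 * 63 / 255
    base = int(exact)
    duty = exact - base  # fraction of frames spent on base + 1
    return base, duty

print(frc_levels(255))  # (63, 0.0): pure white lands exactly, no dithering
base, duty = frc_levels(254)
print(base, round(duty, 2))  # 62 0.75: off-white toggles on most frames
```

So under this toy model, shifting white by a single 8-bit step takes that pixel from no flicker at all to toggling on roughly three quarters of frames.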
The next question, then, is why would broadband flicker be better than fixed-frequency flicker? Or even: why would one fixed frequency of flicker feel different from another?
To examine this I will touch on the field of migraine theory, which is undergoing something of a renaissance at the moment. The first important concept: migraine has recently been characterised as a protective measure for the brain, a kind of emergency shutdown to protect the brain from overexerting itself and causing damage. Following the shutdown, changes in brain chemistry consistent with increased repair and growth are seen: https://neurosciencenews.com/migraine-oxidative-stress-7761/
The next important concept: migraines are now thought to be triggered in response to brainwave synchronisation. To understand this, you need to know that the brain operates with repeating pulses of electrical activity – a bit like how a CPU operates in clock cycles – and that the faster you think, the faster the frequency of these pulses. Also be aware that the brain is not one big CPU: different parts of the brain perform different functions (vision, movement, hearing etc.) at different speeds, so your brain is more like a hundred different CPUs operating independently. The problem is, if one ‘CPU’/bit of your brain is running particularly fast, that speed can spread to other parts of your brain – even though the parts it’s spread to have no processing to do. A migraine can be triggered as an emergency shutdown to stop your brain wearing itself out when this occurs: https://www.aps.org/publications/apsnews/200405/migraines.cfm
So you are staring at a 60 Hz LCD screen. It’s flickering at 30 Hz due to inversion. The visual processing part of your brain has now matched this frequency and it too is operating at 30 Hz: https://www.jneurosci.org/content/39/16/3119. This could spread to other parts of your brain, triggering a migraine as an emergency shutdown measure to prevent damage from overstimulation.
What if you don’t get a migraine? There is a genetic component to migraines, so some people just don’t get them. But a 30 Hz brainwave corresponds to normal-but-intense brain activity: https://www.scientificamerican.com/article/what-is-the-function-of-t-1997-12-22/
So perhaps, rather than a migraine, these people get ordinary headaches from their brain overexerting itself. Perhaps they get facial twitching because the part of their brain that controls those muscles is now operating at 30 Hz when it should be idle. The symptoms of your brain overworking itself could be almost anything, really.
This brings me to something I think about a lot: why do people hate Intel graphics so much? I don’t think it’s because Intel graphics dither (they don’t – I’ve tested lots of them). I think people get pain from Intel graphics precisely because they are not temporally dithering! They are seeing a strong half-frame-rate flicker, undiluted by temporal dithering – something I expect to be worse on laptops, where high-DPI screens have more voltage irregularities and therefore a stronger LCD inversion flicker.
Personally, I have a monitor that I can only use with GPUs that dither. If I hook up my non-dithering Intel GPU, I get pain. The above is the best explanation I have for that.