Most games have a vertical resolution below 288 lines, so you won't get flickering with them. But more recent games use higher resolutions, so those will flicker.
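To see why 288 lines is the cutoff, here's a back-of-the-envelope sketch (the specific line counts are my own assumptions about standard 15 kHz PAL/NTSC timing, not from this thread): a 50 Hz PAL field has roughly 288 visible scanlines, so modes up to that height fit in one progressive field, while taller modes must interlace and each line only repaints at half rate, which is where the flicker comes from.

```python
# Rough sketch, assuming standard 15 kHz TV timings (my numbers,
# not from the thread): modes taller than one field's worth of
# visible lines must be interlaced.

PAL_VISIBLE_LINES = 288   # approx. visible lines in one 50 Hz PAL field
NTSC_VISIBLE_LINES = 240  # approx. visible lines in one 60 Hz NTSC field

def needs_interlace(vertical_res, visible_lines=PAL_VISIBLE_LINES):
    """True if the mode is too tall to draw in a single progressive field."""
    return vertical_res > visible_lines

for v in (200, 240, 288, 480):
    print(v, "interlaced" if needs_interlace(v) else "progressive")
```

On an NTSC-timed display the progressive limit drops to about 240 lines, which is why the same game can behave differently on 50 Hz and 60 Hz sets.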
The only things I have found to flicker are Windows and standard Windows applications. Even VGA-resolution games do not flicker running interlaced on a CGA monitor, from what I've seen.
You must have been using a CGA (or similar) CRT that had a slow phosphor decay (i.e., once charged, the phosphors stayed lit for about two fields). I have dealt with CRTs from that era and found some to be very "flickery" even for 4096- or 262144-color video (I still say the only real computer is the AMIGA!!!!). I guess it depends on how long the phosphors emit light after being charged: too long a persistence and you get blur, too little and you get flicker.
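The blur-versus-flicker tradeoff above can be sketched with a toy model (my own assumption, not measured phosphor data): treat brightness as decaying exponentially with some persistence time constant, and look at how much light is left when the scanline is next repainted. Interlace doubles the repaint interval, so a phosphor that looks steady progressive can look flickery interlaced.

```python
import math

# Toy model (assumed, not measured): phosphor brightness decays as
# exp(-t/tau). A line repainted at refresh_hz has interval 1/refresh_hz
# between excitations; low residual brightness at repaint time reads as
# flicker, while a very long tau smears moving images (blur).

def residual(tau_ms, refresh_hz):
    """Fraction of peak brightness remaining when the line is redrawn."""
    interval_ms = 1000.0 / refresh_hz
    return math.exp(-interval_ms / tau_ms)

for tau in (2, 10, 40):  # short, medium, long persistence, in ms
    print(f"tau={tau:2d} ms  progressive 60 Hz: {residual(tau, 60):.3f}  "
          f"interlaced (per-line 30 Hz): {residual(tau, 30):.3f}")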
Unless you are dealing with a modified oscilloscope turned into a monitor, blur shouldn't be too bad... (BTW, most CGA-style monitors I have dealt with (i.e. the B/W ones) were meant for text or CAD applications of the time, so they most likely had slow response times to make for an image that is much easier on the eyes.)
P.S. Yes, I do realize that CGA stands for "color graphics adapter", but what I am referring to are 15 kHz screens that take standard analogue inputs (CGA, EGA, composite, separated luma/chroma, etc.).