When you start dealing with interlace, the concept of "frame rate" gets a bit muddied. There are three things to consider:
1) The rate at which vertical sync is asserted to the monitor - this is the "field rate", or the rate at which each half of the interlaced frame (known as a field, consisting of every other line) is updated
2) The video frame rate - this is half the field rate, or the rate at which the entire display gets updated on the monitor
3) The graphics update rate - this is the rate at which the game updates the framebuffer from which the video is generated. This can be anything the game wants, but it's usually the field rate divided by an integer (often a power of 2), with the frame rate being a popular choice for several reasons. It need not equal the frame rate even with progressive video (see the sketch just after this list for how the three rates relate).
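A quick numeric example of those three rates, assuming nominal 60Hz NTSC timing (the variable names are just for illustration):

```python
# Nominal NTSC interlaced timing (exact NTSC is 60/1.001 Hz; rounded here)
field_rate = 60.0              # vsync rate: one field (half the lines) per tick
frame_rate = field_rate / 2    # full interlaced frame rate -> 30 Hz

# The game can pick any integer divisor of the field rate for its updates:
for divisor in (1, 2, 4):
    print(f"update every {divisor} field(s) -> {field_rate / divisor:.1f} Hz graphics")
```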
TVs (both NTSC and PAL) use interlaced scanning to achieve higher resolution in a small amount of RF bandwidth. The downside is that you only get half the picture at a time. On TV, however, the picture is updated once per field, not once per frame. Many computer games only updated the graphics once per full frame (every two fields) in order to prevent tearing, but scanned the result out interlaced for compatibility with TVs (and standard resolution arcade monitors). This eliminates graphical artifacts (mostly tearing) due to the interlace scan, but it also makes the motion appear kind of choppy. With some care, the graphics can be updated once per field, but then you have to watch out for tearing.
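To make the two strategies concrete, here's a minimal sketch; the wait_for_vsync() and render_game() helpers are hypothetical stand-ins for whatever the real hardware and game code provide:

```python
def wait_for_vsync():
    """Stand-in: block until the next field's vertical sync (60Hz on NTSC)."""
    ...

def render_game(framebuffer):
    """Stand-in: draw the next game frame into the framebuffer."""
    ...

def per_frame_loop(framebuffer):
    # Update once per full frame (every two fields): no tearing from the
    # interlace, but motion only advances at 30Hz, so it looks choppy.
    while True:
        wait_for_vsync()    # odd field scans out
        wait_for_vsync()    # even field scans out
        render_game(framebuffer)

def per_field_loop(framebuffer):
    # Update once per field: motion advances at 60Hz, but adjacent fields
    # come from different game frames, so horizontal motion can show tearing.
    while True:
        wait_for_vsync()
        render_game(framebuffer)
```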
You can really see all of this going on with Dance Dance Revolution if you compare 4th Mix (or older) against 5th Mix through Extreme. The monitor is standard res. Old versions ran a roughly 480-line framebuffer and scanned it out interlaced, but only updated it once per full frame. The monitor didn't flicker (30Hz monitor flicker is EXTREMELY noticeable and quite objectionable), but the graphics were pretty choppy due to the slow update rate. 5th Mix and newer update the framebuffer every field. The graphics are a lot smoother in motion, but on horizontal motion you can notice some tearing due to the interlace.
Many arcade games simply ran ~240-line progressive video at 60Hz to get timings compatible with standard-res monitors. 480 lines on a standard res monitor is possible per the above, but seemingly "odd" things crop up due to the interlace.
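As a rough check on why those timings are compatible (nominal NTSC numbers, slightly rounded):

```python
# Nominal NTSC: 525 total lines per interlaced frame at ~29.97 Hz
lines_per_frame = 525
frame_rate = 30000 / 1001                 # ~29.97 Hz
h_freq = lines_per_frame * frame_rate     # ~15.73 kHz horizontal scan rate

# A ~240-line progressive mode keeps that same horizontal rate by scanning a
# whole number of lines (262) per vertical sweep, so successive sweeps land on
# top of each other instead of interleaving:
lines_per_sweep = 262
progressive_rate = h_freq / lines_per_sweep   # ~60.05 Hz
print(f"h = {h_freq / 1000:.3f} kHz, progressive refresh = {progressive_rate:.2f} Hz")
```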
Personally, I think interlace is a dirty hack and needs to go away, but we're stuck with it on ATSC due to the popularity of 1080i.
Now, this is where your 30Hz comes from. I'm guessing that the monitor is still refreshed at 60Hz, but the framebuffer is only updated at 30Hz, resulting in choppy graphics. Games which update their framebuffer at 60Hz won't exhibit choppy graphics, but if you watch for tearing on horizontal motion, you'll probably see some.
Of course, now that VGA monitors are reasonably common in arcades, you see plenty of games running ~480 line progressive scan, which looks pretty darned nice.
I'm not saying any of this applies to Tron or Tapper (I have no idea), but it's something to bear in mind.