Tron and Tapper were at 30Hz????
spyhunter:
Hey guys, any new ideas? Thanks! There's gotta be a way to get these Midway games to display on something other than their original CRTs, I refuse to believe that they are CRT exclusive. The installed LCD in my game is just too sweet to give up only to go back to the dinosaur of a CRT that was in it.
I've since tried this RGB-to-VGA converter, which is essentially the same thing that's built into the LCD, and there was no change:
http://www.ambery.com/rgbcgatovgac.html
What device could double the 30Hz refresh rate to 60Hz in order to use a more modern display??
SH
ahofle:
Just curious, why did you buy a $450 untested LCD for your Spy Hunter when you could've just gotten a 19" arcade monitor for under $200?
SavannahLion:
--- Quote from: ahofle on December 20, 2007, 04:29:48 pm ---
--- Quote ---Flicker is only visible if there is a contrast difference in alternate lines.
If a low-res arcade game is double-sized, the odd and even lines are exactly the same and there is no flicker.
--- End quote ---
--- End quote ---
Score one for the CRT team. Why didn't anyone dig up that information when the stupid CRT vs LCD debate was going on?
--- Quote from: spyhunter on February 05, 2008, 03:08:28 pm ---What device could double the 30Hz refresh rate to 60Hz in order to use a more modern display??
--- End quote ---
It's called a line doubler or scanline doubler. Kind of expensive unless you can locate the parts to build your own. 60Hz? Except for ultra-new monitors, almost every one I've seen supports refresh rates as low as 30Hz.
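For what it's worth, the basic idea behind a line doubler is simple: each incoming scanline gets emitted twice, which doubles the horizontal scan rate (~15 kHz up to ~31 kHz) so a VGA-class display can sync to low-res arcade video. Real doublers do this to the analog signal in hardware with a line buffer; this is just a toy sketch of the concept on a list of lines:

```python
def line_double(frame_lines):
    """Emit each scanline twice, as a line doubler does.

    Toy illustration only: real hardware buffers one analog scanline
    and plays it back at twice the rate.
    """
    out = []
    for line in frame_lines:
        out.append(line)
        out.append(line)  # repeat the same line immediately
    return out

# A 240-line input frame becomes 480 output lines:
print(len(line_double(list(range(240)))))  # 480
```

Note this doesn't add any new picture information; it just retimes what's already there so a higher-scan-rate monitor will accept it.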
MonMotha:
When you start dealing with interlace, the concept of "frame rate" gets a bit muddied. There are three things to consider:
1) The rate at which vertical sync is asserted to the monitor - this is the "field rate", the rate at which each half of the interlaced frame (known as a field, consisting of every other line) is updated
2) The video frame rate - this is half the field rate, or the rate at which the entire display gets updated on the monitor
3) The graphics update rate - this is the rate at which the game updates the framebuffer from which the video is generated. This can be anything the game wants, but is usually an integer (often power of 2) divisor of the field rate, with the frame rate being popular, for several reasons. This need not be equal to the frame rate even on progressive video.
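The relationship between those three rates for NTSC-style interlaced video can be sketched in a few lines (nominal values shown; actual NTSC hardware runs at 60000/1001 Hz):

```python
# The three rates for NTSC-style interlaced video (nominal figures).
field_rate_hz = 60.0              # vertical sync rate: one field (every other line) per sync
frame_rate_hz = field_rate_hz / 2 # full interlaced frame = two fields
# The game may update its framebuffer at any divisor of the field rate;
# once per full frame is a common choice because it avoids tearing.
graphics_update_hz = frame_rate_hz

print(field_rate_hz, frame_rate_hz, graphics_update_hz)  # 60.0 30.0 30.0
```

So "30Hz" can mean the frame rate or the graphics update rate while the monitor itself is still being refreshed at a 60Hz field rate.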
TVs (both NTSC and PAL) use interlaced scanning to achieve higher resolution in a small amount of RF bandwidth. The downside is that you only get half the picture at a time. However, on TV the picture is updated once per field, not once per frame. Many computer games only updated the graphics once per full frame, in order to prevent tearing, but scanned out interlaced for compatibility with TVs (and standard-resolution arcade monitors). This eliminates graphical artifacts (mostly tearing) due to the interlaced scan, but also makes the video appear kinda choppy. With some care, the graphics can be updated once per field, but then you have to watch out for tearing.
You can really see this all going on with Dance Dance Revolution if you compare 4th Mix (or older) and 5th Mix through Extreme. The monitor is standard res. Old versions ran a roughly 480-line framebuffer and scanned it out interlaced, but only updated it once per full frame. The monitor didn't flicker (30Hz monitor flicker is EXTREMELY noticeable and quite objectionable), but the graphics were pretty choppy due to the slow update rate. 5th Mix and newer update the framebuffer every field. The graphics are a lot smoother in motion now, but on horizontal motion, you can notice some tearing due to the interlace.
Many arcade games simply ran ~240 line progressive video at 60Hz to get standard resolution compatible timings. 480 lines on a standard res monitor is possible per the above, but seemingly "odd" things crop up due to the interlace.
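The reason ~240-line progressive and ~480-line interlaced can share the same monitor is that both use the same horizontal scan rate. A rough back-of-the-envelope check, using the nominal NTSC figure of 262.5 total lines per field (including blanking):

```python
# Nominal NTSC timing: 262.5 total lines per field at a 60 Hz field rate
# gives the familiar ~15.7 kHz horizontal scan rate. A 240-line progressive
# mode at 60 Hz uses essentially the same line rate, which is why it works
# on the same standard-res monitor.
lines_per_field = 262.5
field_rate_hz = 60.0
h_scan_khz = lines_per_field * field_rate_hz / 1000.0
print(round(h_scan_khz, 2))  # 15.75
```

(Progressive modes round to a whole number of lines, e.g. 262, so the exact rate shifts slightly, but it stays well inside what a standard-res monitor will sync to.)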
Personally, I think interlace is a dirty hack and needs to go away, but we're stuck with it on ATSC due to the popularity of 1080i.
Now, this is where your 30Hz comes from. I'm guessing that the monitor is still refreshed at 60Hz, but the framebuffer is only updated at 30Hz, resulting in choppy graphics. Games which update their framebuffer at 60Hz won't exhibit choppy graphics, but if you watch for tearing on horizontal motion, you'll probably see some.
Of course, now that VGA monitors are reasonably common in arcades, you see plenty of games running ~480 line progressive scan, which looks pretty darned nice.
I'm not saying any of this applies to Tron or Tapper (I have no idea), but it's something to bear in mind.
DaveMMR:
--- Quote from: SavannahLion on February 05, 2008, 04:13:34 pm ---Score one for the CRT team. Why didn't anyone dig up that information when the stupid CRT vs LCD debate was going on?
--- End quote ---
Because it would have been "debunked" by half-truths and nonsense. ;)