Build Your Own Arcade Controls Forum
Main => Main Forum => Topic started by: slapaham on February 26, 2009, 01:41:58 pm
-
I'm sure I know what the answer will be on this, but I thought I'd ask: I got a load of touchscreen CRT monitors, and I saw that at least one of them has composite connections as well as (obviously) VGA. I'm guessing that for the most authentic arcade look I should opt for the composite connection over VGA?
I'm thinking of building an Xbox MAME cab with one sometime in the future possibly...
-
Composite is horrible no matter what. Go VGA.
Do you mean component?
-
Arcade monitors are connected via a type of RGBS or RGBHV cable.
A VGA cable is very similar to that.
A composite connection will introduce NTSC interference and artifacts that arcade games never had.
-
I think I must have got the wrong end of the stick then! :banghead:
I have heard a lot of people saying that CRT TVs are great for displaying MAME games - but perhaps these are American TVs? From what I've gathered, NTSC TVs use S-Video while PAL TVs use composite or SCART...
-
No no no no no. You have gotten the RIGHT end of the stick indeed.
If these monitors have composite connections, that means they can display the low resolutions arcade games need without any scaling.
With this monitor you should be able to make it look *exactly* like an arcade monitor with the use of an ArcadeVGA graphics card or a program called Soft 15kHz.
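If it helps to see the numbers, here's a rough Python sketch of what those tools are setting up (the timing totals are ballpark figures picked for illustration, not from any particular spec sheet):

```python
# Rough sketch: the scan rates of a "low res" arcade-style video mode.
# Timing totals below are illustrative, not from any particular spec.

def scan_rates(h_total_px, v_total_lines, pixel_clock_hz):
    """Horizontal and vertical scan rates from raw video timings."""
    h_freq = pixel_clock_hz / h_total_px   # lines drawn per second
    v_freq = h_freq / v_total_lines        # full frames per second
    return h_freq, v_freq

# A classic 320x240-ish mode: ~6.5 MHz dot clock, ~416 total pixels
# per line (including blanking), ~262 total lines per frame.
h_freq, v_freq = scan_rates(416, 262, 6_500_000)
print(f"horizontal: {h_freq / 1000:.2f} kHz")  # ~15.63 kHz
print(f"vertical:   {v_freq:.2f} Hz")          # ~59.64 Hz
```

That ~15.6kHz horizontal rate is what standard-res arcade monitors expect; Soft 15kHz and the ArcadeVGA exist to coax a PC video card into producing it instead of the 31kHz-plus a desktop monitor wants.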
If you make a thread with the model of your monitors in the monitor forum, I'm sure we can help you along.
I don't think you should use an Xbox for your cab. There's no way to exploit the full potential of a low-res touchscreen monitor with an Xbox.
-
SCART! SCART is closest to being the same as a true arcade monitor, which uses R, G, B and sync.
Component is close, but carries sync on the luma (Y) signal, so technically SCART is better.
-
Putting sync on Luma (or green) does nothing to alter the quality of the video signal. It just saves a wire. Many high-end/professional CAD systems use(d) Sync-on-Green. They were running high-end coax cables for the video, so saving a cable or two by not having to run separate sync lines could mean real cost savings with no impact on video quality.
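Speaking of saving sync wires: merging separate H and V syncs into a single composite sync is classically just one XOR/XNOR gate in hardware. Here's a toy sketch of that logic, assuming active-low syncs (it ignores the serration/equalization pulses a broadcast-correct csync would carry, which most RGB monitors tolerate fine anyway):

```python
# Toy model of the classic one-gate sync combiner. With active-low
# H and V sync, an XNOR gives an active-low composite sync.

def csync_active_low(hsync: bool, vsync: bool) -> bool:
    # Idle (True) during active video, low (False) during either
    # pulse; during vsync, hsync pulses flip it back high, which
    # loosely mimics real serration pulses.
    return hsync == vsync

print(csync_active_low(True, True))    # True  -> active video, no pulse
print(csync_active_low(False, True))   # False -> horizontal sync pulse
print(csync_active_low(True, False))   # False -> vertical sync pulse
```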
"VGA" is RGB with separate syncs; SCART usually uses RGB with sync on a composite video line (CVBS). VGA connections can also use composite sync (like arcade games do). In this case, sync is conventionally placed on the horizontal output and the vertical sync output is either tri-stated or left in a static state. Sometimes the vertical sync output just duplicates the composite sync that shows up on horizontal.
Now, North American component is in a different colorspace (YPbPr) than arcade games and Euro SCART (both RGB), but the conversion introduces minimal error: it can actually be expressed mathematically losslessly, but it's usually done digitally, so there is a little bit of round-off and quantization error that is often not detectable even with instrumentation, let alone the human eye, since it's already down in the noise.
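To put a number on "down in the noise," here's a quick sketch of that round trip with the Rec. 601 coefficients, crudely quantizing the intermediate to 8 bits the way a converter chip might (the test color is arbitrary):

```python
# RGB -> YPbPr -> RGB round trip, Rec. 601 coefficients, with the
# intermediate quantized to 8 bits. Values run 0.0-1.0 (Pb/Pr are
# centered on zero, spanning -0.5..+0.5).

KR, KG, KB = 0.299, 0.587, 0.114   # Rec. 601 luma weights

def rgb_to_ypbpr(r, g, b):
    y = KR * r + KG * g + KB * b
    pb = 0.5 * (b - y) / (1.0 - KB)
    pr = 0.5 * (r - y) / (1.0 - KR)
    return y, pb, pr

def ypbpr_to_rgb(y, pb, pr):
    r = y + 2.0 * (1.0 - KR) * pr
    b = y + 2.0 * (1.0 - KB) * pb
    g = (y - KR * r - KB * b) / KG   # exact inverse of the luma sum
    return r, g, b

def quant8(x):
    # crude 8-bit quantization (ignores studio-swing headroom)
    return round(x * 255) / 255

r, g, b = 0.8, 0.2, 0.1
y, pb, pr = rgb_to_ypbpr(r, g, b)
y, pb, pr = quant8(y), quant8(pb + 0.5) - 0.5, quant8(pr + 0.5) - 0.5
r2, g2, b2 = ypbpr_to_rgb(y, pb, pr)
print(max(abs(r - r2), abs(g - g2), abs(b - b2)))  # ~0.001, under one 8-bit step
```

Skip the quant8() step and the round trip comes back bit-exact, which is the "mathematically lossless" part.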
RGB and YPbPr component should look identical even upon close inspection if done properly. S-Video (Y/C) is close, but a side-by-side comparison will favor the RGB/YPbPr setup, and how good it turns out will depend on the quality of the TV. Composite NTSC/PAL tends to look pretty bad on anything with sharp color edges, which pretty much describes video games.
Regarding the OP's question: it's possible that your monitors may not support low resolutions over the "VGA" input. Many TVs with "VGA" (PC) inputs only go down to 480p (actual VGA timings) over that input, not 480i (CGA or "standard res"). I have no idea why not, given that all the component inputs on those same TVs will, but they don't. Give it a shot, though. It shouldn't hurt anything, and it'll look a heck of a lot better than composite video.
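For reference, the arithmetic behind that 480i/480p split is simple: both build a 525-line frame at roughly 59.94Hz, but interlace draws only half the lines per pass, so 480i needs half the horizontal scan rate:

```python
# Why 480i and 480p sit an octave apart in horizontal scan rate.
FRAME_LINES = 525      # total lines per frame, NTSC-style timing
FIELD_RATE = 59.94     # vertical rate in Hz

h_480p = FRAME_LINES * FIELD_RATE          # progressive: whole frame each pass
h_480i = (FRAME_LINES / 2) * FIELD_RATE    # interlaced: half the lines per field

print(f"480p: ~{h_480p / 1000:.2f} kHz")   # ~31.47 kHz
print(f"480i: ~{h_480i / 1000:.2f} kHz")   # ~15.73 kHz
```

A "VGA" input that bottoms out around 31kHz simply can't sync to the 15.7kHz that 480i needs.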
-
My video card gives me the options of S-Video and DVI. If my monitor also has both, which one am I better off using? I want to get the closest arcade feel possible. Or would I be better off getting a video card with component? These are hypothetical questions, as I do not yet have a monitor, but I am trying to plan ahead. Thanks!
-
The decision between S-Video and DVI depends on the type of monitor you end up using.
If you have an LCD monitor, then you want to go with DVI. You should be trying to get the best picture possible to your monitor, and DVI is superior to S-Video for that purpose. Once you have the image on screen, you can use the software settings in MAME to try to make it match an arcade monitor.
If you are using a CRT monitor, it will most likely not even have a DVI port, and you will have to settle for S-Video.
And yes, it would be better to have a card with component than one with S-Video. But like I said, if you plan on using an LCD screen, you will want neither: you will use a VGA or DVI cable.
-
Thanks! That answers my question perfectly. :)
-
So... on an LCD monitor, when given the choice between a VGA connection or a DVI connection, which is best for gaming? (Particularly of the MAME or Visual Pinball variety, although I do still play PC games like Call of Duty 4 on it.)
From the manual, I read that if I use the DVI connection, I'm locked into a 60Hz refresh rate. But if I use the VGA connection, while I'm still locked to 60Hz at the monitor's default 1920x1200 resolution, if I shift down the scale a bit I can get higher (70-75Hz) refresh rates.
-
LCDs are weird birds when one starts looking at refresh rates and resolutions. DVI is the purest connection you will get to an LCD panel. The probable reason for higher refresh rates on the VGA port is that the device inside doing the analog-to-digital conversion will accept those rates, but that may have little bearing on what the panel displays. If the DVI interface locks you at 60Hz, it's possible that the panel will only ever actually display at 60Hz, even though you may be feeding it something higher before the conversion.
Resolution works the same way on an LCD. All images are displayed at the native resolution of the panel; such is the nature of digital displays. You can feed them smaller (and sometimes larger) resolutions, but they always get converted to the native resolution of the panel before being displayed (if shown full screen).
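If you want a mental model of what the scaler chip is doing, here's about the simplest possible version (nearest-neighbor; real scalers usually filter, which is where the softness comes from). The frame sizes are just examples:

```python
# Toy version of an LCD scaler: every incoming frame is resampled to
# the panel's fixed native grid. Nearest-neighbor shown for clarity.

def scale_nearest(src, src_w, src_h, dst_w, dst_h):
    """src is a flat row-major list of pixels; returns the scaled copy."""
    out = []
    for dy in range(dst_h):
        sy = dy * src_h // dst_h           # nearest source row
        for dx in range(dst_w):
            sx = dx * src_w // dst_w       # nearest source column
            out.append(src[sy * src_w + sx])
    return out

# A 640x480 frame on a 1920x1200 panel: each source pixel becomes 3
# columns wide but alternates 2 and 3 rows tall (1200/480 = 2.5),
# which is one reason non-native modes can look soft or ragged.
frame = ["px"] * (640 * 480)
panel = scale_nearest(frame, 640, 480, 1920, 1200)
print(len(panel) == 1920 * 1200)           # True
```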
Long story short: there is no short answer for LCD panels, because there are too many variables. You might like the slightly fuzzier image you'll probably get from the VGA input for MAME, while the sharper image from DVI might be more to your liking for VP or more modern titles.
RandyT
-
Quote: "I have no idea why not, given that all the component inputs on those same TVs will, but they don't."
I found an old article about this once. I'm a little hazy on the details, since I was actually researching something else entirely, but I think the reasoning is that the manufacturers claim it's a cost-saving measure in the circuitry; something to do with not requiring an extra divider/multiplier, coupled with the capabilities of the CRT itself. To save a few pennies on each monitor? :dunno
Yet another article stated that 15kHz video (and the 30/60Hz refresh rates that go with it) is undesirable due to interference (i.e., flicker) from fluorescent lights. My workplace actually requires our monitors to use refresh rates higher than 60Hz. (I just yank the damn bulbs out from above my office.) I think this is a load of crap as well, since very few modern workplace computers are so old that they still run anything below 60Hz.
I can think of other things that might be factors, but those are all just theories though. :dunno
Personally, I think the whole thing is pants, more so in this day and age of non-CRT monitors and the onboard controllers that run them. There is a manufacturer that builds 15kHz-compatible LCD monitors for obscene prices, so it's not an impossible feat.