
What type of monitor connection is best?


wwwombat:

So... on an LCD monitor, when given the choice between a VGA connection or a DVI connection, which is best for gaming? (particularly of the MAME or Visual Pinball variety, although I do still play PC games like Call of Duty 4 on it)

From the manual, I read that if I use the DVI connection I'm locked into a 60Hz refresh rate. But if I use the VGA connection, while I'm still locked to 60Hz at the monitor's default 1920x1200 resolution, if I shift down the scale a bit I can get higher (70-75Hz) refresh rates.

RandyT:


--- Quote from: wwwombat on March 02, 2009, 01:00:23 am ---So... on an LCD monitor, when given the choice between a VGA connection or a DVI connection, which is best for gaming? (particularly of the MAME or Visual Pinball variety, although I do still play PC games like Call of Duty 4 on it)

From the manual, I read that if I use the DVI connection I'm locked into a 60Hz refresh rate. But if I use the VGA connection, while I'm still locked to 60Hz at the monitor's default 1920x1200 resolution, if I shift down the scale a bit I can get higher (70-75Hz) refresh rates.

--- End quote ---

LCDs are weird birds when one starts looking at refresh rates and resolutions. DVI is the purest connection you will get to an LCD panel. The probable reason for the higher refresh rates on the VGA port is that the device inside doing the analog-to-digital conversion will accept those rates, but that may have little bearing on what the panel displays. If the DVI interface locks you at 60Hz, it's possible that the panel will only ever actually display at 60Hz, even though you may be feeding it something higher before the conversion. Resolution works the same way on an LCD. All images (!) are displayed at the native resolution of the LCD; such is the nature of digital displays. You can feed them smaller (and sometimes larger) resolutions, but they always get converted to the native resolution of the panel before being displayed (if shown full screen).
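If it helps to picture that, here's a minimal sketch (plain Python, names and numbers are just illustrative, using the 1920x1200 panel mentioned above) of the aspect-preserving mapping a scaler effectively performs. It's the idea, not any particular monitor's firmware:

--- Code: ---
# Illustrative only: whatever mode you feed an LCD, the image ends up
# mapped onto the fixed native pixel grid. Names/numbers are mine.

NATIVE_W, NATIVE_H = 1920, 1200  # native resolution of the panel

def scale_to_native(src_w, src_h, keep_aspect=True):
    """Return the size the source image occupies after scaling."""
    if not keep_aspect:
        return NATIVE_W, NATIVE_H  # stretched to fill the screen
    factor = min(NATIVE_W / src_w, NATIVE_H / src_h)
    return round(src_w * factor), round(src_h * factor)

# A classic low-res MAME mode gets resampled, never shown 1:1:
print(scale_to_native(640, 480))    # (1600, 1200), with bars at the sides
print(scale_to_native(1920, 1200))  # (1920, 1200), the only 1:1 case
--- End code ---

The point being that anything other than the native mode gets resampled somewhere along the way, whichever input you use.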

Long story short, there is no short answer for LCD panels, because there are too many variables. But you might like the slightly fuzzier image you will probably get from the VGA input for MAME, while the sharper image from DVI might be more to your liking for VP or more modern titles.

RandyT

SavannahLion:


--- Quote from: MonMotha on February 27, 2009, 04:43:01 pm ---I have no idea why not, given that all the component inputs on those same TVs will, but they don't. 
--- End quote ---

I found an old article about this once. I'm a little hazy on the details since I was actually researching something else entirely. But I digress: I think the reasoning is that the manufacturers claim it's a cost-saving measure in the circuitry, something to do with not requiring an extra divider/multiplier, coupled with the capabilities of the CRT itself. All to save a few pennies on each monitor?  :dunno

Yet another article stated that 15kHz (and the same goes for 30 and 60Hz refresh) is undesirable due to interference (i.e. flicker) from fluorescent lights. My workplace actually requires our monitors to use refresh rates higher than 60Hz. (I just yank the damn bulbs out from above my office.) I think this is a load of crap as well, since very few modern workplace computers are so old as to still run anything lower than 60Hz.
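For what it's worth, that flicker story is easy to sanity-check with a bit of arithmetic. Rough sketch below; the assumptions are mine, not from the article (lamps on magnetic ballasts flicker at twice the mains frequency, and the visible beat is the offset from the nearest multiple of the refresh rate):

--- Code: ---
# Back-of-the-envelope check on the fluorescent flicker claim.
# Assumption (mine): lamps on magnetic ballasts flicker at 2x mains,
# and the visible "beat" is the distance from the lamp flicker to the
# nearest multiple of the monitor's refresh rate.

def beat_hz(refresh_hz, mains_hz=60):
    flicker = 2 * mains_hz  # 120Hz on 60Hz mains, 100Hz on 50Hz mains
    nearest = round(flicker / refresh_hz) * refresh_hz
    return abs(flicker - nearest)

print(beat_hz(60))      # 0  -> locked to the lamps, no slow beat
print(beat_hz(60, 50))  # 20 -> 100Hz lamps vs 60Hz refresh: visible beat
print(beat_hz(75))      # 30 -> higher refresh isn't automatically better here
--- End code ---

So the claim isn't physically crazy, it just doesn't explain why the sync rates got dropped from the inputs.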

I can think of other things that might be factors, but those are all just theories.  :dunno

Personally, I think the whole thing is pants. More so in this day and age of non-CRT monitors and the onboard controllers that run them. There is a manufacturer that builds 15kHz-compatible LCD monitors for obscene prices, so it's not an impossible feat.
