NVidia TV-Out Questions and Problems
Trenchbroom:
I recently purchased an 8 MB ATI Rage 3D LCD videocard for $30 new from a clearance house. For my Celeron 400 MAME cab box it is fine, and the output is very good with my 20" TV. My question is this: is there a real performance increase from choosing newer ATI or Matrox cards over the older ones? Understanding, of course, that the older cards will probably force you to use Win9x, take up a PCI slot, etc.
Everyone talks about Radeons for MAME cabs, but since MAME is so processor-intensive I'm just curious if anyone has experienced a videocard bottleneck by using an older card with 1 GHz+ processors.
Might have to bug my friend to let us do a little experimenting with his 1800XP.
Rick Osborn:
I'd have to agree with anyone who says that NVIDIA cards suck for TV out. ATI is the only way to go.
I was just finishing up a MAME cab yesterday and was using a GeForce 2 card that I had lying around. I had an ATI Rage 128 around too. But I figured I'd use the NVIDIA since it had 64 MB of RAM and the PC would be used for PC games, not only MAME.
Well, the TV out was black & white only, and there was the black border you mentioned. No option changes would get me color either.
I installed the ATI and was back in business.
I recently upgraded my personal cab from an ATI Rage 128 to an ATI All in Wonder (which is why I had a spare ATI card). I did notice a slight improvement in picture quality on the All in Wonder over the Rage. Something to consider, since those cards are very cheap.
Rick
--- Quote ---
I have two small problems that are fairly common, though the fixes don't seem to be working for me.
1). When running s-video from the GF2 to the TV, the picture is black and white. I've seen people complain about this, and all they've had to do was go into Device Selection and set the Output Device to S-Video instead of Auto-Select. Unfortunately it stays black and white for me.
2). Black borders on the left and right side of the TV image. I have about 1" of black on either side of the image on a 32" TV. I've been to the TV MAME page (http://www.trouble-makers.com/kami/emulation) on custom resolutions, and added them fine, but nothing changed at all.
I don't know if it's me or XP or what. I know XP doesn't allow the display to go into any resolution less than 800 x 600. Also, if I launch SFA (which runs at 384 x 224) while connected to my monitor, it still ends up displaying at 800 x 600. Is there some command-line setting I should be adding (see the sketch just after this quote)? I have a feeling it's still being sent to the TV at 800 x 600, since the video barely flickers or changes size from Windows to MAME; a full-screen command prompt, however, utilizes the whole screen.
Tetsu, I know this is your department, and I'd appreciate any advice you could give. I know you're down with the TView scan converters, but I thought you were able to tweak away the black borders with just the hacked resolutions in the NVidia drivers.
Thanks.
--- End quote ---
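On the command-line question in the quote above, a rough sketch (my guesses, not gospel): you could launch MAME with explicit resolution options so it requests the native mode instead of falling back to the 800 x 600 desktop. The option names below are from memory of the DirectDraw-era Windows builds; run mame -showusage to confirm what your build actually accepts.

--- Code: ---
import subprocess

# Hypothetical launcher sketch: ask MAME for SFA's native 384x224 mode
# instead of letting it stay at the 800x600 desktop resolution.
# Option names are assumptions from DirectDraw-era Windows MAME;
# check "mame -showusage" for what your build really takes.
cmd = [
    "mame.exe", "sfa",
    "-resolution", "384x224",   # request the native mode
    "-switchres",               # allow MAME to change the display mode
    "-nohwstretch",             # keep pixels 1:1 rather than filtering up
]
subprocess.run(cmd)
--- End code ---

Note that switchres can only pick modes the driver actually exposes, so the hacked custom resolutions from the TV MAME page still have to be in place first.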
Lilwolf:
Before everyone trashes the TVout for NVidia...
remember... NVidia makes chips, not video cards. Each company that makes/sells NVidia cards uses its OWN video-out chip.
So the answer is... some are GREAT! but many SUCK!
I've seen pictures of some that are as good as, if not better than, ATI's.
Trouble is... which is good? The VIVO ones use the Philips chip, which has great output... but the new Ti4200's use a new Philips chip that was produced to reduce cost... is it good? Who knows.
If you go NVidia, make sure the company you buy from is local and allows returns... Or buy the exact same card as someone who had good luck!
or...
go ATI... and you don't have to worry (don't get the XPert... I hear they had a different chipset or something)
tetsu96:
There's been a lot of good replies and opinions on this thread. I'll try to sum up the most important points.
1.) No matter which video card with TV-Out you pick up from which maker, you'll likely need to use Hardware Stretch for most games. This bothers some people more than others; you'll have to decide on your own. If you want to run all your games at their native resolutions without any stretch filtering / effects, you'll need either a good scan converter, an RGB connection from the video card straight to the TV, or an Arcade Monitor (again, using RGB).
2.) If you decide HW stretch is not so bad (or can live with the speed hit from the -sharp effect), then the quality of the TV outs varies. ATI historically has had great TV-Out for a lot of users on this board; NVidia's had mixed results. The Brooktree chip is not supported well by NVidia, but other TV-Out chips might have better driver support. Poor driver support results in underscan (not filling the whole TV) and the inability to adjust image position / size / etc.
3.) 2D speed and 3D speed are totally different. GeForces have great 3D speed, but they're not the fastest in 2D. Different driver releases have an effect on 2D speed, and it even affects rendering at custom resolutions (which use no stretching effects whatsoever). I've heard Radeons are faster in that regard, but I've never tested that myself. I wonder if there's any kind of solid metric that could be used to gauge 2D performance...
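As a rough stab at such a metric (just a sketch, assuming Python and pygame are handy; not any kind of standard benchmark), you could time raw full-screen blits and compare cards:

--- Code: ---
import time
import pygame

# Back-of-envelope 2D throughput test: count how many full-screen
# surface-to-screen blits the card/driver sustains in five seconds.
pygame.init()
screen = pygame.display.set_mode((640, 480))
frame = pygame.Surface((640, 480)).convert()  # match display format
frame.fill((80, 80, 80))

frames, t0 = 0, time.time()
while time.time() - t0 < 5.0:
    screen.blit(frame, (0, 0))
    pygame.display.flip()
    frames += 1

print(f"{frames / 5.0:.1f} blits/sec")
pygame.quit()
--- End code ---

If the driver forces vsync the number will just pin at the refresh rate, so take it with a grain of salt.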
Personally, I've used a bunch of video cards with TV outs and several scan converters. My own opinion (for my cabs at least) is to go with a scan converter that supports a wide scan range; this should let you use custom resolutions, which is as good as it gets next to the whole ArcadeOS / AdvanceMAME setup (which is definitely the most exact, at the expense of ease of use). Of course, your mileage may vary...
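To make "wide scan range" concrete: whether a custom resolution fits a converter (or a 15 kHz arcade monitor) falls out of the mode timings. Horizontal frequency is the pixel clock divided by total pixels per line, and vertical refresh is the horizontal frequency divided by total lines per frame. The totals below are made-up round numbers for illustration, not real driver values:

--- Code: ---
# Scan-rate arithmetic for a hypothetical 384x224 custom mode.
pixel_clock_mhz = 7.6   # assumed pixel clock
h_total = 484           # 384 visible + assumed blanking/sync
v_total = 261           # 224 visible + assumed blanking/sync

h_freq_khz = pixel_clock_mhz * 1000 / h_total   # ~15.7 kHz
v_freq_hz = h_freq_khz * 1000 / v_total         # ~60 Hz

print(f"horizontal: {h_freq_khz:.1f} kHz, vertical: {v_freq_hz:.1f} Hz")
--- End code ---

A standard-res TV or arcade monitor wants roughly 15.7 kHz / 60 Hz; a scan converter's spec sheet lists the range it will accept, so you can check any custom mode against it the same way.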