PC to HDTV Video Card Recommendations

Pages: << < (2/3) > >>

crashwg:

take a look around www.byopvr.com

bishmasterb:


--- Quote from: gonzo90017 on December 28, 2007, 09:19:54 pm ---What's wrong with using the vga input?

--- End quote ---
Nothing necessarily. In my personal experience on desktop displays, DVI tends to give a better, crisper picture with less noise. This probably varies greatly depending on the display, cable, cable length, etc., but I'll probably just play it safe and go DVI/HDMI.


--- Quote from: MonMotha on December 28, 2007, 11:42:07 pm ---I have yet to find a PC video card with DVI that can't do 720p off the DVI output.  Some older cards can't do 1080p, but anything in the nVidia FX5000 series or newer should be able to hit any ATSC mode other than 480i (standard res) on the DVI output (and some can even do 480i).  I've tested at least a FX5600 Ultra and a FX5200.

In other words, grab whatever is cheapest that meets your needs.

--- End quote ---
That's what I did roughly two years ago, and the DVI output on the card I chose (the cheapest :) ) didn't handle HDTV resolutions; the picture ended up scaled and looked absolutely horrible (unusably so).


--- Quote from: MonMotha on December 28, 2007, 11:42:07 pm ---You'll need to find some way to force that mode choice.  Depending on how your TV reports capabilities, Windows may not offer what you want.  You can use PowerStrip or a modeline in Linux.  http://www.linuxis.us/linux/media/howto/linux-htpc/video_card_configuration.html has all the timing information you could possibly need to hit a mode your TV will support.  While most TVs will scale just about anything, it's always best to hit a native mode, so pick whatever is native on your display.

--- End quote ---

Windows support is required. And avoiding scaling and other nonsense that'll interfere with picture quality is my main concern.
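For anyone going the Linux/modeline route mentioned above, here's what a hand-rolled 720p modeline looks like (this is the standard CEA-861 timing for 1280x720 @ 60 Hz with a 74.25 MHz pixel clock; treat it as a sketch and check it against what your own TV accepts):

```
# 1280x720 @ 60 Hz, 74.25 MHz pixel clock (standard CEA-861 720p timing)
#                  pclk  hdisp hsyncstart hsyncend htotal vdisp vsyncstart vsyncend vtotal
Modeline "1280x720@60" 74.25 1280 1390 1430 1650   720   725    730      750 +hsync +vsync
```

It goes in the Monitor section of xorg.conf, with "1280x720@60" added to the Modes line of the relevant Display subsection.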


--- Quote from: MonMotha on December 28, 2007, 11:42:07 pm ---Picture quality is excellent.  It's as good as you're going to get out of your display, assuming you use a native mode, not something scaled.  Be sure your TV is set to overscan the display.  It'll crop off the edges, but if you ask it to crush to fit/underscan, the scaler will be invoked and weird artifacts can crop up.

--- End quote ---
I've never seen any setting regarding "overscan" on either of my HDTVs. Is that a TV setting, or a videocard setting?


--- Quote from: SpeedEng on December 29, 2007, 11:52:53 am ---2 cards I have used in my htpc

geforce 6200
able to push 720p (99% of the time)

geforce 6600
720p without a problem

--- End quote ---
Would you call the video quality of those cards excellent? As good as a PS3 or high-def DVD player, for instance?

Do you know if those cards support 1080?


--- Quote from: crashwg on December 29, 2007, 12:16:36 pm ---take a look around www.byopvr.com

--- End quote ---

Thanks, will do.

tristan:

You can set the scaling on nVidia cards in the display settings/advanced/nVidia control panel, as long as it detects the monitor's abilities properly. I am using a 7600 and it works at HD resolutions just fine on my Sony widescreen via DVI.

MonMotha:

The GeForce 4 and older stuff was limited to I think 1280x1024 by dot clock on their DVI outputs, which should be sufficient to hit 720p.  I'm not sure what card you were using, but most PC modes are reasonably close to 720p.  I know that getting these modes out of Windows is often a pain and a half, which is why I prefer using Linux, as it (almost always...they're slowly screwing this up in the name of "ease of use") does what you tell it to.  You should be able to do this in Windows with a tool like PowerStrip, and I think Windows will present the resolutions to you if the other end identifies as a TV.  I've not used ATI gear (it doesn't work worth a darn in Linux, so it's practically useless for me), so I can't really comment on their abilities.  If you want the gold standard as far as video output options go (with great 2D and utterly lackluster 3D performance), grab the highest-end Matrox "G" series card you can find.  Those suckers can do just about anything.  All that said, you should be able to do this just fine with any nVidia card on the market, including the free-after-rebate 5000/6000 series stuff that pops up from time to time.

One thing to remember is that many TVs (at least that I've found) do not correctly report their capabilities to PCs via the DDC over DVI or especially the "VGA" input, but will of course happily accept TV timed modes.  This can lead to configuration hassles, but awesome results once you overcome all your software acting smarter than you want it to.
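If you want to see what your TV is actually reporting over DDC, you can dump its EDID (e.g. with get-edid/parse-edid on Linux) and decode the first detailed timing descriptor, which carries the native mode. A minimal sketch of that decode follows; the 18-byte descriptor layout is from the EDID 1.3 spec, and the sample bytes are a hand-built 1280x720 descriptor rather than a dump from a real TV:

```python
def decode_dtd(d):
    """Decode an 18-byte EDID detailed timing descriptor (EDID 1.3 layout)."""
    pixel_clock_hz = (d[0] | (d[1] << 8)) * 10_000       # stored in units of 10 kHz
    h_active = d[2] | ((d[4] & 0xF0) << 4)               # upper 4 bits live in byte 4
    h_blank  = d[3] | ((d[4] & 0x0F) << 8)
    v_active = d[5] | ((d[7] & 0xF0) << 4)               # upper 4 bits live in byte 7
    v_blank  = d[6] | ((d[7] & 0x0F) << 8)
    refresh = pixel_clock_hz / ((h_active + h_blank) * (v_active + v_blank))
    return pixel_clock_hz, h_active, v_active, round(refresh)

# Hand-built descriptor for 1280x720 @ 60 Hz (74.25 MHz; 370/30 pixel blanking).
# Trailing sync-position bytes are zeroed since we don't decode them here.
dtd = bytes([0x01, 0x1D, 0x00, 0x72, 0x51, 0xD0, 0x1E, 0x20] + [0] * 10)
print(decode_dtd(dtd))   # (74250000, 1280, 720, 60)
```

A TV that fails to populate this descriptor sensibly (or reports only PC modes) is exactly the "doesn't correctly report its capabilities" case described above.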

As far as overscan and such, there's usually a setting on your TV.  You want the setting that makes it behave like there's a DTV tuner/set-top box on the other end of the line.  This will chop off the outer 5-10%, but results in no scaling, assuming you're feeding it a native resolution.  This is usually the default unless the TV thinks a PC is hooked up to the other end.  Most TVs also offer a mode where they scale things down by 5-10% to make what would normally be overscanned visible again.  This can be useful on PCs, since PC monitors are traditionally adjusted so that the edge of the imaged area sits right at the edge of the viewable area, but on a fixed-resolution device like most TVs this has to be done by scalers, and that introduces visual non-niceties.  Some TVs will default to this option if you send them video in a known standard PC format "for your convenience".  Don't confuse any of this with the "overscan" option found on old-school "TV out" (480i composite/s-video) outputs.  That's a pretty different beast.
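To put numbers on that 5-10%: if a TV crops 5% off each edge of a 1280x720 frame, here's roughly what survives (the 5% figure is just the typical value mentioned above, not a spec):

```python
w, h = 1280, 720
overscan = 0.05                              # assume 5% cropped off each edge
visible_w = round(w * (1 - 2 * overscan))    # lose 5% on the left and 5% on the right
visible_h = round(h * (1 - 2 * overscan))    # same top and bottom
print(visible_w, visible_h)                  # 1152 648 pixels actually reach the screen
```

The underscan/"fit" mode instead squeezes all 1280x720 pixels into that smaller 1152x648 region, which is precisely the scaling operation you're trying to avoid.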

As far as visual quality, DVI does tend to do better, but not entirely for the reasons you might at first think.  Most modern TV technologies such as DLP, LCD, and plasma have discrete pixels.  Old-school CRTs do not.  In order to address these discrete pixels on the screen, discrete pixels have to be recovered from the input signal if they are not present.  Analog video has discrete lines, but no discrete pixel timing.  DVI/HDMI (same thing, different connector, basically) have each pixel clocked discretely by their very nature.  The timing recovery process needed for analog video is complicated and expensive/difficult to get right.  TV/monitor makers get better at it all the time, but if you avoid the process entirely, results are often better (and certainly no worse).  Of course, there's also the issue of noise on the analog video line, and you'll have the same problem with hitting native resolutions (often a worse one, since those "VGA" inputs are really designed only for PC video and make wild assumptions about horizontal resolution and pixel aspect ratio that aren't correct when you're feeding in native video).
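The "each pixel clocked discretely" point is easy to see from the arithmetic: DVI carries an explicit pixel clock, and for 720p the standard timing works out like this (active + blanking totals are the usual CEA-861 720p60 numbers):

```python
h_total = 1650        # 1280 active pixels + 370 blanking per line
v_total = 750         # 720 active lines + 30 blanking per frame
refresh = 60          # Hz
pixel_clock = h_total * v_total * refresh
print(pixel_clock / 1e6, "MHz")   # 74.25 MHz -- comfortably under single-link DVI's 165 MHz cap
```

That's why even cheap DVI-equipped cards can manage 720p: the link itself has headroom to spare, and any failures are in the card's mode-setting logic rather than the interface.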

SpeedEng:

Yes, the 6600 is as good as my PS3.

I'm limited on my plasma to 1366x768 (I'm running 1360x768; any higher causes overscan).

1080: not sure if the 6600 has enough oomph for it.
