The GeForce 4 and older stuff was limited, I think, to 1280x1024 by the dot clock on their DVI outputs, which should be sufficient to hit 720p. I'm not sure what card you were using, but most PC modes are reasonably close to 720p. I know that getting these modes out of Windows is often a pain and a half, which is why I prefer using Linux, as it (almost always... they're slowly screwing this up in the name of "ease of use") does what you tell it to. You should be able to do this in Windows with a tool like PowerStrip, and I think Windows will present the resolutions to you if the other end identifies as a TV. I've not used ATI gear (it doesn't work worth a darn in Linux, so it's practically useless for me), so I can't really comment on its abilities. If you want the gold standard as far as video output options go (with great 2D and utterly lackluster 3D performance), go grab the highest-end Matrox "G" series card you can find. Those suckers can do just about anything. All that said, you should be able to do this just fine with any nVidia card on the market, including the free-after-rebate 5000/6000 series stuff that pops up from time to time.
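Just as a sketch of the Linux route (the "HDTV" identifier here is made up, adjust to taste): the standard 1280x720@60 timing as a hand-written modeline you can drop into the Monitor section of xorg.conf, then list "1280x720_60" on the Modes line of your Screen section:

  Section "Monitor"
      Identifier "HDTV"
      # Standard 720p60 timing: 74.25 MHz dot clock, 1650 total clocks per line, 750 total lines
      Modeline "1280x720_60" 74.25 1280 1390 1430 1650 720 725 730 750 +hsync +vsync
  EndSection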
One thing to remember is that many TVs (at least the ones I've run into) do not correctly report their capabilities to PCs via DDC, over DVI or especially over the "VGA" input, but will of course happily accept TV-timed modes. This can lead to configuration hassles, but you get awesome results once you overcome all of your software acting smarter than you want it to.
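If you hit that, one way around it on a recent-ish X setup with RandR support (the "DVI-0" output name below is just a guess; run plain xrandr first to see what your driver actually calls it) is to define and force the mode by hand:

  xrandr --newmode "1280x720_60" 74.25 1280 1390 1430 1650 720 725 730 750 +hsync +vsync
  xrandr --addmode DVI-0 "1280x720_60"
  xrandr --output DVI-0 --mode "1280x720_60"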
As far as overscan and such, there's usually a setting on your TV. You want the setting that makes it behave like there's a DTV tuner/set-top box on the other end of the line. This will chop off the outer 5-10%, but it results in no scaling, assuming you're feeding it a native resolution. This is usually the default unless the TV thinks a PC is hooked up to the other end. Most TVs also offer a mode where they scale the picture down by 5-10% to make what would normally be overscanned visible again. This can be useful on PCs, since PC monitors are traditionally adjusted so that the edge of the imaged area is right at the edge of the viewable area of the monitor; but on a fixed-resolution device like most TVs, that adjustment has to be done by scalers, and this introduces visual non-niceties. Some TVs will default to this option if you send them video in a known standard PC format, "for your convenience". Do not confuse all this with the "overscan" option found on old-school "TV out" (480i composite/s-video outputs). It's a pretty different beast.
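Just to put rough numbers on the scaling penalty: if the TV shrinks the picture by 5% to bring the overscanned edges back into view, a 1280x720 frame gets resampled onto roughly 1216x684 of the panel's 1280x720 pixels, so nothing lines up on pixel boundaries anymore and everything goes a little soft.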
As far as visual quality goes, DVI does tend to do better, but not entirely for the reasons you might at first think. Most modern TV technologies such as DLP, LCD, and plasma have discrete pixels. Old-school CRTs do not. In order to address these discrete pixels on the screen, discrete pixels have to be recovered from the input signal if they are not present. Analog video has discrete lines, but no discrete pixel timing. DVI/HDMI (same thing, different connector, basically) have each pixel clocked discretely by their very nature. The timing recovery process needed for analog video is complicated and expensive/difficult to get right. TV/monitor makers get better at it all the time, but if you avoid the process entirely, results are often better (and certainly no worse). Of course, there's also the issue of noise and such on the analog video line, and you'll have the same problem with hitting native resolutions (often more so, since those "VGA" inputs are really designed only for PC video and make wild assumptions about horizontal resolution and pixel aspect ratio that aren't correct when you're feeding in native video).
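To put a number on that recovery job, using the standard 720p60 timing as an example: the signal carries 1650 clock periods per line and 750 lines per frame at 60 frames per second, so the TV has to reconstruct a 74.25 MHz sampling clock (1650 x 750 x 60) and its phase from little more than the hsync edges, then guess where each of the 1280 active pixels falls within the line. Over DVI/HDMI that same 74.25 MHz clock is sent right alongside the pixel data, so there's nothing to guess.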