edit: **Argh... I meant this to go in the other forum, but maybe it's close enough to the subject matter. More n00b mistakes...**
I've had some experience with the whole video card/display issue from helping a friend build a home theatre PC, but we have run into a really annoying problem. I'd like to duplicate his setup for a cabinet if this can be fixed.
Long ago, when the PC was first built, we used an old scan converter to display the PC on a TV. After a hardware upgrade he went with an NVIDIA Ti 500 using its S-Video output, which gave a much better result. Next we borrowed an ATI Radeon and found (as everyone knows by now) that it looks better still. He then bought ATI's daughter card that offers component video out, but we never got it to work right.

Now he's on an ATI Radeon 8500 and is generally pleased, except for one thing: if the PC powers off or crashes, the card always defaults to the VGA output. That means he has to drag a monitor into the room, fight the cabling nightmare to hook it up, boot, and then tell the card to use the TV output again. We can't seem to get it to boot to the TV output by default, and if you don't hook up a monitor, you get no display at all. I've read here that some people are using ATI cards, so do you have this problem? What's the fix?