Main > Main Forum
S-video (TV) vs CRT VGA display: examples
Xiaou2:
I agree that a decent S-video signal can be attained. Most modern video cards have good S-video out, and most decent modern TVs will have a good S-video input, especially if the TV has component inputs as well.
I used to have an older ATI PCI video card, and its S-video out is completely different from the newer cards'. The newer ones have an option to sort of interlace the output to create a higher-definition image, albeit with some flicker. (Even without that option, the output seemed a lot cleaner.) Personally, I kind of liked the more blended look of the older card, even if it wasn't as accurate. Basically, the newer output looks like MAME on a PC monitor, not an actual arcade monitor.
Which brings me to the other point...
While hooking up a SNES to my Sony XBR 34" 1080i 16:9 tube TV (oh yes, it's sweet ;D, but big and heavy!) with both composite and S-video, I literally have to use the composite input because the S-video is too clean for those games. The edges and lack of blending are just way too harsh. The muddier composite signal is pretty much how these games were designed to be seen and displayed. I found this to be the case with many PS1 games as well: they looked wretched over the better-quality S-video, but acceptable over the lower-quality composite signal.
Also, as stated, Dragon's Lair is a terrible example. The laserdisc player outputs a lowly composite signal that gets converted to RGB, and it wasn't exactly defined or sharp. Definite blur and visible losses.
AND, taking a picture of a monitor with and without flash makes a HUGE difference. I know, because I've done the same thing, and the resulting images were completely different. It affects the color and clarity much more than on an LCD or modern high-def PC monitor, because of the way a CRT is built (larger pixel triads, larger shadow-mask lines, and phosphor that shows up as a grayish look when you drown the tube in a flash of light).
Edit:
Finally, many TVs have individual setting configurations for the various inputs, and/or you need to adjust them to compensate for the differences between signal types. Boosting the contrast alone would probably do the trick to match the two output images (possibly lowering the brightness and boosting the color output as well).
Some TVs also have enhancement options that may be altering the image as well.
leapinlew:
--- Quote from: BigBadOl'MeanXiaou2 on November 04, 2011, 10:08:40 am ---
--- End quote ---
You guys done screwed up now...
Jack Burton:
My own two cents:
You guys know that my personal arcade display is a Mitsubishi AM-3501R presentation monitor.
This monitor is great because it has a fairly coarse dot pitch and uses ChromaClear technology, so it's extremely similar to a classic arcade monitor. It's probably made of much finer components, and is possibly slightly finer pitched, but it's very close. Much closer than many arcade monitors you can buy today, and miles ahead of a desktop PC CRT.
This monitor has both S-video and RGB inputs on BNC connectors, so it's very easy to compare S-video vs RGB.
A while back I got involved in a score competition here on the forum for a game called Twin Cobra, an old-school shmup. I played mostly on MAME, but I also became aware of another version of the game: it's available on the PlayStation as part of a compilation called Toaplan Shooting Battle.
The cool thing about the games on Toaplan Shooting Battle is that they run at native resolution. They display the games in 320x240, just like a real PCB.
So, by connecting my PlayStation via S-video and my computer via the RGB inputs, I was able to compare the video quality of both on my Mitsu monitor.
A word about that monitor: it has very little, if any, filtering applied to the video. There is no comb filter, and there's a button on the front labeled "notch" that simply blurs the screen when pressed in. I believe it's a type of noise-reduction filter.
The result is that when you display composite or S-video sources on this monitor, compared to the 36" Toshiba CRT TV next to it, the Mitsu screen shows much more color bleeding, artifacting, fringing, etc. It's showing the raw video signal with very little processing going on. Of course, I also turn the sharpness setting all the way down.
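For anyone curious why a notch filter blurs the picture, here's a toy numerical sketch (my own illustration, not how any specific TV chipset actually does it). In composite video, chroma rides on a ~3.58 MHz subcarrier added to luma on one wire; a cheap "notch" separator just cuts frequencies near the subcarrier, which also throws away fine luma detail near that band. S-video keeps luma and chroma on separate wires, so no notch is needed. The sample rate and notch width below are arbitrary choices for the demo:

```python
# Toy sketch: composite vs S-video luma recovery with a crude notch filter.
# Not broadcast-accurate; just shows the tradeoff the notch button makes.
import numpy as np

FSC = 3.579545e6        # NTSC color subcarrier frequency (Hz)
FS = 4 * FSC            # sample rate: 4x subcarrier (a common capture rate)
N = 1024

t = np.arange(N) / FS
# Luma: a sharp brightness edge, like a vertical line on screen
luma = np.where(np.arange(N) < N // 2, 0.2, 0.8)
# Chroma: subcarrier representing a colored region
chroma = 0.15 * np.sin(2 * np.pi * FSC * t)

# Composite carries both on one wire; S-video would carry luma separately
composite = luma + chroma

# Crude notch: zero an FFT band around the subcarrier to strip chroma
spec = np.fft.rfft(composite)
freqs = np.fft.rfftfreq(N, d=1 / FS)
spec[np.abs(freqs - FSC) < 0.5e6] = 0   # arbitrary +/- 0.5 MHz notch
recovered = np.fft.irfft(spec, n=N)

# Raw composite shows "dot crawl" (subcarrier riding on the luma);
# the notch removes it, but at the cost of ringing/softening near the edge
print("max error, raw composite vs true luma:", np.max(np.abs(composite - luma)))
print("max error, notched luma vs true luma: ", np.max(np.abs(recovered - luma)))
```

The notched signal lands much closer to the true luma than the raw composite does, but it is no longer a perfect step: the energy removed around 3.58 MHz also belonged partly to the sharp edge, which is the softening you see when the notch button is pressed in.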
So how did the 240p S-video compare to the 240p RGB?
Extremely favorably. It was easy to tell the difference between the two, of course, but the S-video still looked awesome: very clean lines, almost no color bleed, and good blacks and whites. The only really noticeable thing was a bit of fringing around all the objects on screen, and the image was much brighter and more contrasty. Overall it was quite a bit softer than the RGB, and some very bright color gradients were lost.
The RGB image was of course razor sharp, but actually looked a bit "flat" compared to the S-video. The color gradients were completely defined and could reach levels the S-video couldn't without causing blooming. And black was black.
The key thing to take away from this, though, is that the S-video coming from the PlayStation was still A+ in quality. It would still beat 90% of the arcade monitors I've seen in the wild.
I think as long as you have a display with the right dot pitch and color temperature, and you can preserve the native resolution and levels, you're always going to get a very nice, accurate arcade image, no matter whether the video source is S-video, component, or RGB.
You know, I also have an RCA Lyceum TV that I fooled around with and re-calibrated for fun. Just for kicks, I hooked the PSX up to it via the BNC composite video port.
The image on it looked really f'ing good too. Maybe even composite isn't so bad after all in the right situation....
But I guess that's a discussion for another thread >:)
Gray_Area:
I have compared RGB to S-video on the same monitor, my Monivision, and the S-video looked like what I posted above. And I'll re-state: both sets of images (his and mine) were of Daphne, not a DVD, let alone a Blu-ray source, if you're familiar.
And, Randy, my monitor is from '06. I think his TV may be at most two or three years older. On that note, my Monivision I think is from '99...
As for 'monitors I've seen in the wild', I will return perhaps in a few weeks with some images to address that.
leapinlew:
--- Quote from: Gray_Area on November 05, 2011, 02:33:50 pm ---I have compared RGB to S-video on the same monitor. My Monivision. And S-video looked like what I posted above.
--- End quote ---
Try a different s-video cable...