Build Your Own Arcade Controls Forum
Main => Monitor/Video Forum => Topic started by: Ummon on June 08, 2008, 11:11:32 pm
-
Wire colors are entirely arbitrary. There are some customary designations, but not all manufacturers use them. Your best bet is to just tag each wire using a multimeter.
-
Okay, I measured the wires. Going by the standard VGA pinout, I get the following values:
red: around 1V
green: around 1.2V
blue: around 1.2V
sync: both are around 12V. I thought sync was supposed to be small and negative?
All the rest appear to be grounds or unused. I'm going to tie the syncs together, so I want to make sure I don't blow something. I couldn't find the sync voltage spec for VGA. Do those values sound right?
-
bump
-
Don't measure the signals; instead, use the continuity or resistance setting to check each wire against the connector pins. Then you'll know exactly which wire is which.
Due to the AC (and non-sine) nature of these signals, they cannot be accurately measured with cheap multimeters, and even the measurement you'd get with a true RMS multimeter wouldn't be particularly useful.
For reference, VGA video signals are 0.7V peak (full white) when terminated into 75 ohms. When terminated into Hi-Z like a multimeter, these will read twice as high.
The sync signals from a PC are both conventionally active high TTL levels, so about 5V high, 0V low.
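To put numbers on that, here's a quick back-of-envelope sketch in Python. It models the idealized case: a driver with a 75 ohm back-termination resistor, and a 1.4V open-circuit full-white swing (the value implied by the 2:1 divider; real cards vary).

[code]
# Why a Hi-Z multimeter reads double the nominal VGA level.
# Idealized model: driver behind a 75-ohm back-termination resistor.
R_SRC = 75.0    # source (back) termination inside the video card, ohms
R_LOAD = 75.0   # monitor input termination, ohms
V_OPEN = 1.4    # open-circuit full-white swing, volts (idealized)

# Properly terminated: source and load resistors form a 2:1 divider.
v_terminated = V_OPEN * R_LOAD / (R_SRC + R_LOAD)
print(f"into 75 ohms: {v_terminated:.2f} V")  # 0.70 V

# Into a Hi-Z meter essentially no current flows, so there is no
# drop across the source resistor and you see the full open-circuit swing.
print(f"into Hi-Z:    {V_OPEN:.2f} V")        # 1.40 V
[/code]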
-
Never mind. I bit the bullet and hooked it up. Essentially I wanted to see whether I got the same results as with that RGB-to-VGA connector, and I did. So if I want to run games at different resolutions, I'm going to have to replace the monitor with a multisync.
-
For reference, VGA video signals are 0.7V peak (full white) when terminated into 75 ohms. When terminated into Hi-Z like a multimeter, these will read twice as high.
The sync signals from a PC are both conventionally active high TTL levels, so about 5V high, 0V low.
Is there a technical reason why a PC uses conventional TTL levels for the sync but 0.7V for the video signals? Why didn't the designers just use 5V across the board, or 0.7V across the board?
-
5V is the standard "TTL" level, which makes it convenient to drive the syncs straight from the outputs of the (then common, though not so much anymore) 5V logic that generates all of a video controller's timing. These timing signals really are just digital signals, and many monitors feed them straight into other digital logic, especially modern "digital" monitors. Keeping them at 5V removes the need for a bunch of unnecessary (read: $$$) level shifting.
The choice of 0.7Vpp for the video signals is an interesting one. There are several legacy reasons as well as technical ones, and the legacy concerns probably stem from the technical ones. The biggest reason I can give is DC restoration: if you AC-couple 0.7Vpp analog video (i.e. run it through a series capacitor to block any DC bias path), which is a commonly required operation, the circuit to restore the DC level is very simple when 0.7V is the peak level, because 0.7V is what a silicon diode tends to develop when it is "good and on". Conventionally, signals with embedded sync (e.g. sync-on-green or composite NTSC) use -0.3V as their sync level, 0V as blanking, roughly 0-0.1V (depending on the standard) as black, and 0.7V as full white.
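If it helps, here's an idealized behavioral model of that AC-couple-then-clamp trick in Python. It's not a circuit simulation; the ideal clamp simply pins the positive peak at one diode drop, and the toy waveform and its 2V offset are made up for illustration.

[code]
import numpy as np

V_DIODE = 0.7  # forward drop of a silicon diode that is "good and on"

# A toy 0.7 Vpp "video" waveform riding on an arbitrary DC offset.
t = np.linspace(0.0, 1.0, 1000)
video = 0.35 * (1 + np.sin(2 * np.pi * 5 * t)) + 2.0  # spans 2.0..2.7 V

# AC coupling through a series capacitor: the DC component is lost.
ac_coupled = video - video.mean()

# Ideal diode clamp to ground: the diode conducts whenever the node
# tries to exceed ~0.7 V, charging the cap until the positive peak
# sits at exactly one diode drop. DC level restored, no reference needed.
restored = ac_coupled - ac_coupled.max() + V_DIODE

print(f"peak after restore: {restored.max():.2f} V")  # 0.70 V (full white)
print(f"min  after restore: {restored.min():.2f} V")  # 0.00 V (blanking)
[/code]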
Also, that 0.7V is measured after the 2:1 divider formed by the 75 ohm source and termination impedances, so the actual top rail has to be a little higher than 1.4V. If you wanted 5V signals after the divider, you'd need a 10-12V rail. Computers have those, but circuit designers like to avoid using them outside the power sections, for various reasons that aren't really worth delving into here.
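Putting numbers on that, using the same idealized 2:1 divider as above:

[code]
# What open-circuit swing (and hence supply rail) would 5 V signals need?
R_SRC = R_LOAD = 75.0       # idealized source and termination, ohms
v_wanted_at_load = 5.0      # hypothetical 5 V level after the divider
v_open_needed = v_wanted_at_load * (R_SRC + R_LOAD) / R_LOAD
print(f"needed open-circuit swing: {v_open_needed:.0f} V")  # 10 V
# ...plus headroom for the output stage, hence the 10-12 V rail.
[/code]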
Keeping the signals small also helps with high-bandwidth (high-resolution) signals, because of the limited slew rate (Wikipedia it) of the analog output buffers. 0.7-1V is a good compromise between practical slew-rate limitations ("back in the day") and noise immunity on shielded coaxial cable.
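To get a rough feel for the slew-rate argument, here's a back-of-envelope sketch. The ~80 MHz pixel clock (an XGA-class mode) and the one-third-of-a-pixel rise-time budget are just illustrative assumptions, not figures from any spec.

[code]
# How fast must the output buffer slew for a black-to-white transition
# in one pixel? Compare a 0.7 V swing against a hypothetical 5 V swing.
PIXEL_CLOCK_HZ = 80e6                 # ~XGA-class pixel clock (assumed)
pixel_period_s = 1 / PIXEL_CLOCK_HZ   # 12.5 ns per pixel
t_rise_s = pixel_period_s / 3         # allow ~1/3 of a pixel to settle

for swing_v in (0.7, 5.0):
    slew_v_per_us = swing_v / t_rise_s / 1e6
    print(f"{swing_v:.1f} V swing needs ~{slew_v_per_us:.0f} V/us")
# 0.7 V swing needs ~168 V/us; 5 V needs ~1200 V/us -- about 7x harder.
[/code]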
Basically, there are several reasons to keep the levels different and essentially none to make them the same, so the choice was made many years ago by VESA and it remains the way it is today.
Of note, the old EGA and CGA standards put out by IBM used TTL-level video because those computers didn't generate video through digital-to-analog conversion. Instead, each R, G, and B line was hooked up to the output of a TTL driver, and "3-bit" color was used (CGA reduced this to 2bpp and used a palette to generate the RGB signals), allowing each of red, green, and blue to simply be either on or off. Some other implementations allowed e.g. two different levels per channel through the use of two output drivers and a resistive divider, again driving the divider directly off TTL outputs, as in the sketch below. This kept the PC cheap at the slight cost of sometimes making the monitor more expensive.
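Here's what that two-driver resistive divider works out to, as an illustrative Python calculation. The 470/1000 ohm resistor values are made up for the example (this is not the actual IBM schematic), and the monitor input is assumed to be high-impedance TTL.

[code]
# Two TTL outputs per color channel, summed through unequal resistors
# into a high-impedance monitor input, give four voltage levels per
# channel from purely digital drivers.
V_TTL_HIGH = 5.0   # idealized TTL output levels
V_TTL_LOW = 0.0
R_MSB = 470.0      # resistor on the "most significant" driver (made up)
R_LSB = 1000.0     # resistor on the "least significant" driver (made up)

def channel_voltage(msb: int, lsb: int) -> float:
    """Node voltage of the two-resistor summer into a Hi-Z input."""
    v1 = V_TTL_HIGH if msb else V_TTL_LOW
    v2 = V_TTL_HIGH if lsb else V_TTL_LOW
    # Superposition (Millman's theorem): average weighted by conductance.
    return (v1 / R_MSB + v2 / R_LSB) / (1 / R_MSB + 1 / R_LSB)

for msb in (0, 1):
    for lsb in (0, 1):
        print(f"bits {msb}{lsb} -> {channel_voltage(msb, lsb):.2f} V")
# bits 00 -> 0.00 V, 01 -> 1.60 V, 10 -> 3.40 V, 11 -> 5.00 V
[/code]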
Modern standards at higher resolutions and color depths require IC DACs and are subject to all the aforementioned stuff.
Sorry about the length, but you asked a technical question and got a technical answer :)
-
Thanks! I've always wanted to find a good explanation of this stuff! :applaud: