@ZSNES: Try this one ->
http://files.arianchen.de/zsnesw.zip <- for some reason you'll still have to play with the settings a bit.
@old consoles...
There's a common mistake here.
NTSC does NOT mean "fixed 640x480 4:3"
1. A single scanline takes the same time, whether it has 512 pixels or just 256.
2. NTSC is defined as 525 lines (some of them are sync and overscan) at 29.97 Hz, plus fixed sync timings.
Now, since no conventional TV could display "just" 29.97 Hz without visible flicker (plus some other historical limits), the image gets "interlaced". Interlaced means you only see one half of the image per frame, but at twice the frame rate.
You'll get 262.5 lines at 59.94 Hz.
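Those numbers all follow from each other; here's a quick sketch of the arithmetic (standard NTSC values, nothing console-specific):

```python
# Back-of-the-envelope NTSC timing.
TOTAL_LINES = 525        # lines per full (interlaced) frame
FRAME_RATE = 30 / 1.001  # ~29.97 Hz

field_rate = FRAME_RATE * 2           # two interlaced fields per frame -> ~59.94 Hz
lines_per_field = TOTAL_LINES / 2     # -> 262.5 lines
line_rate = TOTAL_LINES * FRAME_RATE  # horizontal frequency -> ~15734 Hz

print(f"field rate:      {field_rate:.2f} Hz")
print(f"lines per field: {lines_per_field}")
print(f"line rate:       {line_rate:.0f} Hz")
```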
3. Most old equipment uses Composite-Cables (yellow RCA) or even fed their signal over an RF antenna cable which "hurt" the signal clarity.
4. TV Tubes usually have a so called "overscan" area, which is used to compensate for non "square" tubes, and usually does not contain vital data.
5. Pixels are NOT square on a CRT!
That being said...
Most classic consoles/home computers used only 200 to 240 "active" lines, with the rest being sync and overscan.
There are several reasons for that, mostly that the video chips were only 8-bit, didn't have much processing power, and memory was quite expensive.
So let's take a look at the NES, though the technical details here most certainly are not exact.
Let's say the NES has an active "resolution" of 256 pixels per scanline. We know we need some "sync" and "overscan" pixels too, so we end up with 320 "real" pixels.
As for the lines, we know that NTSC "forces" us to use 262 lines, so we end up with 240 "active" lines and some others for sync and overscan.
Now we have a "real" resolution of 320x262 pixels, but we only see the "active" part of 256x240 pixels on screen.
Pixels in a single frame -> 320 pixels * 262 lines = 83840 pixels
Now this pixel data gets sent down the wire 60 times per second -> 83840 pixels * 60 frames per second = 5030400 pixels per second.
That's called the "pixel clock" and is usually measured in Hertz -> 5030400 Hz = 5030.4 kHz = 5.0304 MHz
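The same arithmetic as a sketch (using the simplified numbers from above, not real NES timing):

```python
# Pixel-clock arithmetic for the (simplified!) NES example above.
total_w, total_h = 320, 262    # "real" pixels incl. sync/overscan
active_w, active_h = 256, 240  # visible pixels
fps = 60                       # simplified; NTSC is really ~59.94 fields/s

pixels_per_frame = total_w * total_h     # 83,840
pixel_clock_hz = pixels_per_frame * fps  # 5,030,400 Hz

print(f"pixel clock: {pixel_clock_hz / 1e6} MHz")  # 5.0304 MHz
```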
Now in many, if not all, cases the image consists of pixels, which themselves are based on three "primary" colors -> Red, Green, Blue
In a composite cable there are only two wires, one being ground, the other being the signal... so these 3 color "channels" have to be mixed together.
That's the first time we "degrade" the signal. Then this combined signal has to be sent down the wire, which degrades it further.
On RF leads there's also the audio signal coming down the same wire!
Last but not least...
The whole mess needs to get separated again in the TV to drive the three "guns" in your TV's tube.
That causes "color bleeding" and a somewhat "unsharp" image. However, that never was a real problem on those low-res consoles.
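A rough sketch of that mixing step: the three RGB channels get combined into one brightness (luma) signal plus color (chroma) components. The luma weights below are the standard NTSC ones; this is only the color-space half of the story, since real composite additionally modulates the chroma onto a ~3.58 MHz subcarrier, which is where most of the bleeding comes from.

```python
# RGB -> YIQ (the NTSC color space), standard coefficients.
# This is the "mixing" half of composite video; the subcarrier
# modulation that causes the actual bleeding is omitted here.
def rgb_to_yiq(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma (brightness)
    i = 0.596 * r - 0.274 * g - 0.322 * b   # chroma axis 1
    q = 0.211 * r - 0.523 * g + 0.312 * b   # chroma axis 2
    return y, i, q

# Pure white carries full luma and zero chroma:
print(rgb_to_yiq(1.0, 1.0, 1.0))
```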
Newer systems, let's take the Amiga as an example, had faster chips and more memory.
On an Amiga you can select various resolutions, with 320x200 'lores', 640x200 'medres' and 640x400 'hires' being the "standards".
However the Amiga is still "bound" to NTSC limits.
So the time a single line takes in 320-pixel mode and in 640-pixel mode is exactly the same. The only difference is the time a single pixel is shown.
Remember... Pixels are NOT square on a CRT!
So if we want to show a 200 line mode (the other lines are usually filled with the background color), we still output 262 lines 60 times a second.
And if we want to show a 400 line mode (real resolution would be 525 lines), we output all even lines first, then we output all odd lines, 30 times a second.
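Here's what "same line time, half the pixel time" means in numbers. An NTSC scanline always lasts about 63.5 µs; the ~52 µs figure for the visible part of the line is an approximation I'm assuming for illustration:

```python
LINE_TIME_US = 63.5  # one NTSC scanline always lasts ~63.5 microseconds
ACTIVE_US = 52.0     # roughly the visible part of the line (approximation!)

pixel_times = {}
for active_pixels in (320, 640):
    pixel_times[active_pixels] = ACTIVE_US * 1000 / active_pixels  # nanoseconds
    print(f"{active_pixels}-pixel mode: {pixel_times[active_pixels]:.2f} ns per pixel")
```

Doubling the horizontal resolution just halves how long each pixel is shown; the line itself takes the same 63.5 µs either way.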
Some systems, like the PlayStation, switched between several resolutions.
The "Sony PlayStation" logo, for example, is shown in a 640x480 (interlaced) mode; you can notice the flickering.
Most games used lower resolutions like 512x240 or even 320x240, depending on how fast the game could render.
Another fine example would be a classic VHS tape versus a DVD.
Both output to a standard TV in NTSC, but the DVD will most likely have a much better image, as VHS tapes can barely manage more than roughly 320x240 "pixels".