One thing people seem to be ignoring is that the 2600 only had RF out, and it wasn't anywhere near as good as the RF converters you see today. The tuners on TVs were pretty poor back then as well, with a lot of folks still using tube sets with the channel changer that went "clunk, clunk, clunk" when you turned it. I'm an "old guy" and I was all of 12 when it came out. Just out of curiosity, genesim, how old were you? If your first experience with the machine was 10 years after it was introduced, you were likely viewing it on a better display than most had when they played it hours upon hours each day.
As for seeing "hair" on an old TV with an RF broadcast, I'm not so sure one really could. You could tell it was supposed to be hair, but it certainly didn't have enough definition to see strands. Hell, Hi-Def is the first time one could really experience that level of detail. Also keep in mind that moving images, long-persistence phosphor coatings and interlaced images tended to hide a lot of the clunky nature of the display. The first time I saw one of those VCRs or computer cards with the ability to snag and display a raw field of NTSC video, I was flabbergasted. It looked absolutely terrible. Blocky, noisy, etc. But this was very representative of the signal. The fact that your brain averages everything tends to make things look better than they would if your brain were sensitive to each detail in that tiny moment of time.
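If anyone wants to see the idea in action rather than take my word for it, here's a rough little Python sketch of what I mean. It's not real NTSC decoding, just an illustration: two half-resolution fields get woven into a frame, fresh noise is added every field the way RF snow works, and a simple exponential average stands in for phosphor persistence and the eye smearing frames together. The names, resolutions, and noise levels are all made up for the demo.

```python
# Illustrative sketch only: why a single grabbed field/frame looks worse
# than what you perceive on a running CRT. Sizes and noise values are
# arbitrary choices for the example, not real NTSC parameters.
import numpy as np

HEIGHT, WIDTH = 480, 640  # one interlaced frame = two 240-line fields

def weave_fields(odd_field, even_field):
    """Interleave two 240-line fields into one 480-line frame."""
    frame = np.empty((HEIGHT, WIDTH), dtype=np.float32)
    frame[0::2] = odd_field   # odd scan lines
    frame[1::2] = even_field  # even scan lines
    return frame

def perceived_image(frames, persistence=0.5):
    """Crude stand-in for phosphor persistence / visual averaging:
    an exponential moving average over successive frames."""
    acc = frames[0].astype(np.float32)
    for f in frames[1:]:
        acc = persistence * acc + (1.0 - persistence) * f
    return acc

# Synthetic "broadcast": the same picture, but fresh RF noise on every field.
rng = np.random.default_rng(0)
picture = rng.uniform(0, 1, (HEIGHT, WIDTH)).astype(np.float32)
frames = [
    weave_fields(picture[0::2] + rng.normal(0, 0.2, (240, WIDTH)),
                 picture[1::2] + rng.normal(0, 0.2, (240, WIDTH)))
    for _ in range(10)
]

frozen_noise = np.std(frames[0] - picture)                   # what a frame-grab shows
averaged_noise = np.std(perceived_image(frames) - picture)   # roughly what the eye "sees"
print(f"noise in one frozen frame:      {frozen_noise:.3f}")
print(f"noise after temporal averaging: {averaged_noise:.3f}")
```

Run it and the averaged number comes out noticeably lower than the frozen-frame number, which is basically what those frame-grab cards exposed: stop the averaging, and you see how rough the raw signal really is.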
You can believe it didn't happen if you want. But on my very used $150 10" color TV, it most certainly did. It was a very different world, my friend.
RandyT