Interesting article on making LCDs look like fuzzy CRTs
DJ_Izumi:
Er, yeah, actually, NTSC is entirely interlaced, always has been; progressive television display and broadcasting is a fairly recent development.  It's tied to the 60hz electrical grid: with the video signal running at 60hz, it was easier to take those 30 frames per second and cut each frame into an even and an odd set of lines covering the entire picture.

However, the signal wasn't necessarily progressive underneath either.  That is, one odd and one even field don't necessarily combine into one single progressive frame.  A lot of video cameras capture every field independently, which results in a hypersmooth 60hz video signal.  (You see it noticeably on camcorders and shows like 'Cops'.)  That's also why home console and arcade games typically run at 60fps, so they fit that signal perfectly.
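A rough sketch of what "cutting a frame into fields" means, in Python (just to illustrate the idea; real NTSC actually starts on the odd field and uses half-lines, so treat the ordering here as a simplification):

--- Code: ---# Split one 480-line progressive frame into two interlaced fields.
def split_into_fields(frame):
    """frame is a list of scanlines; returns (even_field, odd_field)."""
    even_field = frame[0::2]  # lines 0, 2, 4, ... drawn in one 1/60s pass
    odd_field = frame[1::2]   # lines 1, 3, 5, ... drawn in the next pass
    return even_field, odd_field

frame = ["line %d" % n for n in range(480)]
even, odd = split_into_fields(frame)
print(len(even), len(odd))  # 240 240 -> two half-pictures per 1/30s frame
--- End code ---

A camera that captures each field at a separate instant is handing you 60 distinct half-pictures a second, which is where that hypersmooth look comes from.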

That said, interlacing really doesn't have much effect on the blurring and bleeding you see on an NTSC TV.  A lot of the blurring is from the signal itself.  Take a PS2 and connect it to a TV over composite and then over component: the component picture is brilliantly improved, and probably as good as an RGB connection to an arcade monitor (or at least very few people would perceive a difference).  Far, far, far less noise, blurring and bleeding.

Arcades have almost exclusively used clean, sharp RGB connections for their monitors, whereas home consoles only started shipping with anything better than much lower-quality composite in the last few years.
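To make the "blurring is from the signal" point concrete, here's a toy model in Python (the (y, u, v) representation and the blur radius are my own assumptions, not how any real TV filters): composite crams the color information into a narrow band, so color detail gets smeared while the luma survives better.

--- Code: ---# Toy model of composite color bleed: blur the chroma of one scanline
# while leaving the luma sharp. Real NTSC filtering is more involved.
def smear_chroma(scanline, radius=3):
    """scanline: list of (y, u, v) tuples; returns chroma-blurred copy."""
    out = []
    n = len(scanline)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        u = sum(p[1] for p in scanline[lo:hi]) / (hi - lo)
        v = sum(p[2] for p in scanline[lo:hi]) / (hi - lo)
        out.append((scanline[i][0], u, v))  # luma untouched, chroma bleeds
    return out

# A hard color edge on constant luma: after smearing, the chroma
# transition spreads across several pixels -> visible color bleed.
line = [(0.5, 0.4, -0.2)] * 8 + [(0.5, -0.3, 0.3)] * 8
print([round(p[1], 2) for p in smear_chroma(line)])
--- End code ---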
genesim:
Xiaou2,

You are so shrouded by emotion that you cannot seem to have a conversation with me without putting me down.

I comprehended just fine, and I am talking about the Atari 2600 first.  Not to win an arcade battle, but to get simple facts straight first.

Turbo I agree with you on, Cool Spot I agree with you on...MOVE ON.  Repeating what we agree on is not moving the conversation forward.  If you say black and I say black, your hammering into my head that I didn't say black doesn't make sense.  I agree, I agree, I agree, I agree.

Though I don't like the hot pink or the blurred display (I can stand the stripes), I do see your point.


--- Quote ---SO?  You really think they were pumping out Interlaced TV in  1950?!!!  Have you even SEEN a TV from 1950?!   
--- End quote ---

No, not the '50s...pre-'70s, YES.  You didn't read the article and learn your facts.  Start there, and then you will be better educated for a debate with me.


--- Quote ---In fact... almost NO arcade games that I know of are interlaced.
--- End quote ---

Arcades don't use the NTSC standard...or if they do, I don't know of any.  Who argued otherwise?  NTSC is a broadcast standard...not just "the ability to interlace" or other such nonsense.  You seem to be confused again...please read the link.  It will help you.


--- Quote ---LOGIC?!  You have NONE!
--- End quote ---

More putdowns.  I am being patient, though.  Of course you say I have no logic, and then you get onto me because I call them programmers.  Same diff.  It's all semantics.

I am speaking of them as a whole, graphic designers to final coders.  Put them in a basket and shake them up.  It is the intent that matters, not the words.

DJ_Izumi,

Thanks for pointing out the obvious, though I disagree with you.  The PS2 is a poor example, because its resolution surpasses what an NTSC display can show natively.  Any interlaced signal scaled up to a progressive display gives color bleed...IF the display has more resolution than the signal.  With less, the repeated lines fill the whole display, so bleed is minimal and not something to be designed for.
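Here's roughly the situation I mean, sketched in Python ("weave" is just the naive way a progressive display can recombine fields, not necessarily what any given set does):

--- Code: ---# "Weave" deinterlacing: interleave two fields captured 1/60s apart
# back into one progressive frame.
def weave(even_field, odd_field):
    frame = []
    for e, o in zip(even_field, odd_field):
        frame.append(e)  # scanline from the earlier field
        frame.append(o)  # scanline from the later field
    return frame

# Anything that moved between the two field captures lands in different
# positions on adjacent lines -> combing/blur, visible only when the
# display has enough resolution to show every line of both fields.
even = ["ball at x=10"] * 3
odd = ["ball at x=14"] * 3  # the ball moved between fields
print(weave(even, odd))
--- End code ---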

The PS2 doesn't have an RGB connection as far as I know...one of the things that actually makes me angry.

 
Xiaou2:
My Mistake.

 Growing up with several computers such as the C64 and Amiga...
interlaced modes were around 640*480 and above.

 While standard TV may have used interlacing to display...I doubt it would have
used a resolution anywhere near 640*480.  My 480i TV can only output that res.
 
 Tekken III can use an interlaced mode.  It's either 640*240 (the standard default)
or, interlaced, 640*480.
 
 The Genesis at max res didn't come close to that...and TV back in the Atari days,
well, we know it wasn't broadcast in 640*480.

 So, is it that standard TV used to be a resolution of 320*240...but was drawn in
two passes, at 320*120 lines each pass?
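The arithmetic behind that question, sketched in Python (the 240-visible-line figure is a common ballpark, not exact for every set):

--- Code: ---# NTSC scans 525 lines per frame as two interlaced fields of 262.5
# lines each, with roughly 240 lines per frame actually visible.
# Interlacing halves the line count per pass, not the horizontal
# resolution, so a 320*240 picture is 320*120 per field.
LINES_PER_FRAME = 525
VISIBLE_LINES = 240
FIELDS = 2

print(LINES_PER_FRAME / FIELDS)  # 262.5 scanned lines per field
print(VISIBLE_LINES // FIELDS)   # 120 visible lines per field
print("320 x", VISIBLE_LINES // FIELDS, "per pass")
--- End code ---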
genesim:
That's the first time I have seen you admit to me that you were wrong.

Now if I could just get you to understand that the phosphor in the backlight of an LCD is not the color mechanism but rather an illuminating agent, unlike in a CRT or plasma, where it is integral to the color mechanism...then we are good to go.

Notice I didn't call you an "idiot" or a "moron", or say that anyone who thinks like that has "half a brain", or any other such putdown.  I would rather discuss this politely, and I do not think I am without guilt...rather, I am trying to bring this together.  It is much easier if I have others cooperating.

As for the Genesis, I don't know exactly how it works, but that seems reasonable.

But at any rate, that is not the important part.  My point is that one interlaced pass carries less than the full picture, so at no point are you seeing all the resolution.  Hence you see blur...distortion...jaggies...etc.

The Atari 2600 has its full resolution in each pass no matter how you put it.  Hence the clearer picture...barring RF and other such problematic interference.
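Sketched side by side in Python (the 2600 actually draws fewer than 240 visible lines, so take the counts as illustrative, not exact):

--- Code: ---# Contrast a 240p-style console signal (full picture every pass) with
# 480i (half the lines per pass, alternating).
def lines_drawn_per_pass(mode):
    if mode == "240p":
        return list(range(240))        # same lines, every 1/60s pass
    if mode == "480i":
        even = list(range(0, 480, 2))  # one pass...
        odd = list(range(1, 480, 2))   # ...then the other
        return even, odd

print(len(lines_drawn_per_pass("240p")))  # 240 lines, complete each pass
even, odd = lines_drawn_per_pass("480i")
print(len(even), len(odd))                # 240 + 240, but never at once
--- End code ---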

But designing for this?  I don't believe it in Atari's case for one single solitary second.  How could it be, if the display module (even in one pass) surpasses the code?
genesim:
By the way, I too grew up with the C64, and it is one of the best home machines of the classic period, ever.  Archon is one of my favorite games, and the Nintendo version was pure crap compared to it.

Spy Vs Spy also didn't translate well.   So many cool games.