Author Topic: Native interlace and mainline MAME  (Read 2317 times)


Recapnation

Native interlace and mainline MAME
« on: December 04, 2019, 07:00:31 am »
This is a split-off from this conversation, which actually has little to do with Switchres updates and GM development.

The latest MAME version (0.216) adds some new rendering modes in an attempt to simulate an interlaced display on flat panels. Oddly enough, it does so only for Popeye (Nintendo, 1982), despite there being dozens of arcade games and home systems that used interlacing. This is not really relevant for native-mode CRT users, but it has served (I believe) to bring back an old topic which seems unsolved/undocumented yet may have relevance for everybody -- pixel smoothing coded at driver level:

I don't know how many frames MAME outputs to compensate for interlacing; it could be 60 progressive frames or it could be 30. The main point is that if the original game has interlaced output, then the fields are blended for progressive output.

EDIT: That is how MAME handles interlacing, and the process is destructive; to this day there is no solution or blending method that can make it non-destructive/reversible. Once a blend/conversion is applied, there is no undoing or reverting it. What is worse, you lose the resolution and half of the frames. This is the reason why no blend/conversion looks as good as the original interlaced frames, provided you watch them on a CRT.
One of the biggest problems for the MAME devs is that no one uses a CRT as a display, so they have nothing to test against. Interlaced material can only be properly displayed on a CRT, which makes it impossible to show the MAME devs evidence of what kind of data is really lost. This may explain why they did not care about interlacing that much. What MAME needs is a switch where the user can define what kind of output is wanted (something similar to GroovyMAME's monitor switch). Currently, all interlaced material in MAME is postprocessed with blending.

So I am not a game expert, nor do I know all interlaced games, but if a game has interlaced output and uses it for more than some resolution switching for in-game menus etc., then data/motion will be lost when output through MAME (even if MAME outputs 60 blended, progressive frames). One example I posted years ago is the Atari 2600: play H.E.R.O. and press pause in a scene with a lot of motion, and you will see a blended progressive frame that originally is not there.

If someone here knows good examples of interlaced games, please tell us more :) -- games that really take advantage of interlacing. A bad example would be a game that internally creates progressive frames and merely outputs them interlaced, which is possible but forgoes the benefits you would usually get. One great example is the laserdisc game Firefox: it even uses interlaced real-life footage as a background for the game.

There is more to write on this topic, but that is the gist of it. If there is interest, I can write more.
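A toy sketch may help illustrate why the field blending described above is destructive. This is illustrative Python, not MAME code; the function name and pixel values are invented for the example. Two distinct fields are averaged into one progressive frame, after which the original fields cannot be recovered:

```python
def blend_fields(field_a, field_b):
    """Average two interlaced fields (lists of pixel values) into one
    progressive frame -- a lossy, non-invertible operation."""
    return [(a + b) // 2 for a, b in zip(field_a, field_b)]

field_a = [0, 255, 0, 255]   # e.g. a detail lit only on the even field
field_b = [255, 0, 255, 0]   # and a different detail on the odd field

blended = blend_fields(field_a, field_b)
print(blended)  # [127, 127, 127, 127] -- both fields collapse to grey
```

Note that many different pairs of fields map to the same blended frame, so there is no "undo": half of the temporal information is simply gone, which is the point U-Man is making.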

It's my understanding that U-Man is indeed referring to this matter (let's hear from him, though), which had a discussion thread at Bannister's back in 2013, no less:

Quote
It's in the driver.

http://git.redump.net/mame/tree/src/mame/video/tia.c

Note that the second half of the 256-color palette is half the brightness of the first (the 2600 has only 128 unique colors), and that it allocates 3 framebuffers in device_start. Starting at line 772 is where it blends things against the previous frame.

https://forums.bannister.org/ubbthreads.php?ubb=showflat&Number=85324&page=1

Sadly the pics are lost. The author's answer is this:

Quote
Wow, I added that a loooong time ago.

The reason for this was that there is one game that enables 2 different sprites on alternating frames (I don't know which game anymore; it might have been Invaders). The standard code would produce a lot of flickering, but smoothing things a bit like this made the game a bit more playable. Also, when taking a screenshot you would actually see both sprites ;)

The slightly blurry effect also made it look a bit more like an old TV ;)

Of course proper emulation would be to have the pixel decay handled by something in the core, or by something like hlsl which of course wasn't available at the time.

I have no problems removing the code for combining 2 frames, just be aware that you can then run into a few games where you experience more obvious flickering of sprites.
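A rough model of the two-frame smoothing the author describes might look like the following. This is a hedged sketch in Python with invented names, not the actual tia.c code: each output frame is blended with the previous one, so a sprite drawn only on alternating frames shows up (dimmed) in every output frame instead of flickering on and off:

```python
def smooth(prev_frame, cur_frame):
    """Blend the current frame with the previous one, as a stand-in for
    the driver-level smoothing described in the quoted post."""
    return [(p + c) // 2 for p, c in zip(prev_frame, cur_frame)]

frames = [
    [255, 0],   # frame 1: sprite A on, sprite B off
    [0, 255],   # frame 2: sprite A off, sprite B on
    [255, 0],   # frame 3: back to sprite A
]

prev = frames[0]
for cur in frames[1:]:
    out = smooth(prev, cur)
    print(out)  # [127, 127] -- both sprite pixels visible every frame
    prev = cur
```

This also explains the screenshot remark: a single captured frame contains traces of both sprites, at the cost of the blur the author mentions.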

So, is it known which drivers are coded like this? It seems unlikely that this is MAME's universal behaviour for interlaced graphics, as U-Man implies. And are we sure this "feature" hasn't been removed entirely over the six years since, given that the devs there are clearly against having it now that there are shaders?

Other than that, a game with interlaced graphics should look the same through MAME as on the real thing, though you must make sure there isn't an anti-flicker filter on your system (for pre-NAOMI games, that is).