
Author Topic: Input lag VS...  (Read 1333 times)


Elaphe666

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline
  • Posts: 22
  • Last login:Today at 12:31:10 am
  • I want to build my own arcade controls!
Input lag VS...
« on: December 11, 2024, 03:28:11 am »
I'm currently using a gaming LCD monitor (165 Hz with freesync) with MAME, no vsync, no syncrefresh, low latency on, and I'm very happy with the input lag (not noticing any, actually). I was considering using another computer with a CRT TV, running Windows 10 with an ATI card plus a VGA-SCART cable, and CRT Emudriver with GroovyMAME. My concern is: will I get worse, the same or better input lag than with my freesync monitor? I mean using GroovyMAME without frame delay, since I've heard that feature is problematic and has to be configured game by game (I have no time or will for that). I also know it demands a very powerful computer, and I only have an i5 10400 and don't want to be worried again about frame loss in demanding games such as CAVE shoot 'em ups or STV games. Thank you.

Calamity

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline
  • Posts: 7457
  • Last login:January 01, 2025, 12:41:44 pm
  • Quote me with care
Re: Input lag VS...
« Reply #1 on: December 11, 2024, 06:34:55 am »
Quote
My concern is: will I get worse, the same or better input lag than using my freesync monitor? I mean using GroovyMAME without frame delay

Roughly:

MAME + lowlatency + no vsync + freesync -> sub-frame latency
GroovyMAME + lowlatency + vsync + frame delay -> sub-frame latency
GroovyMAME + lowlatency + vsync + no frame delay -> a bit over 1 frame latency
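In mame.ini terms, the first and third rows map roughly to the settings below. This is only a sketch: lowlatency, waitvsync and syncrefresh are standard MAME options, while frame_delay is GroovyMAME-specific; verify the exact names against your build's -showusage output.

```ini
# LCD + VRR setup (row 1): vsync off, the variable-refresh display handles tearing
lowlatency   1
waitvsync    0
syncrefresh  0

# GroovyMAME on a CRT without frame delay (row 3) would instead use:
# waitvsync   1
# syncrefresh 1
# frame_delay 0
```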

Quote
since I've heard that feature is problematic and has to be configured game by game

Where did you hear that exactly? There's so much hate in the world.

Well, yes, it has to be configured per game, but it can be done in seconds through the UI, and the value is saved and remembered automatically.
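For the record, the saved value ends up in a per-game ini, so it can also be set by hand. Hedged: the frame_delay name and its 0-9 range are as commonly documented for GroovyMAME, and the file name below is just an example ROM.

```ini
# sf2.ini -- per-game override, read automatically when that ROM is launched
frame_delay  7
```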

Quote
(I have no time and will for that)

Fair enough. GroovyMAME intends to offer a solution even if that requires some involvement on the user's part.

If you have no time but have some cash, GroovyMiSTer is the solution that beats everything else today.
Important note: posts reporting GM issues without a log will be IGNORED.
Steps to create a log:
 - From command line, run: groovymame.exe -v romname >romname.txt
 - Attach resulting romname.txt file to your post, instead of pasting it.

CRT Emudriver, VMMaker & Arcade OSD downloads, documentation and discussion:  Eiusdemmodi

Elaphe666

Re: Input lag VS...
« Reply #2 on: December 11, 2024, 08:34:21 am »
I've heard that high values of frame delay can make some games skip animations, with very bad results. Another problem is having to check the performance of every game in my collection to see whether frame delay demands more power than my CPU can provide. That idea is a no-go for me.

After reading your comments in a YouTube video, I've learnt that frame delay is intended to cure the lag added by vsync (which has to be on with GroovyMAME), and that MAME + lowlatency + no vsync + freesync gives input lag as low as GroovyMAME with frame delay set to the maximum.

So, considering the above, I've decided to abandon my project of using GroovyMAME with my CRT TV and buy a big modern TV set with freesync. I'm very happy with the look of MAME on a TFT screen, especially now with BGFX and shaders such as CRT_geom_deluxe, and I have yet to try some shaders converted from RetroArch, which look amazing (although I think they are for 4K only).

I will not dump my CRT TV yet, as I will focus on the FPGAs. The lack of supported games is a shame, but that may change a lot in the future. What is GroovyMiSTer? A MAME core for MiSTer? Any advantages over the MAME core? I thought the MAME core on an FPGA was already the perfect solution for a CRT, with no lag at all.

donluca

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline
  • Posts: 275
  • Last login:Today at 06:15:06 am
  • I want to build my own arcade controls!
Re: Input lag VS...
« Reply #3 on: December 11, 2024, 02:56:31 pm »
Always remember that CRTs add virtually 0 lag.

With an LCD you might have sub-frame input lag on the machine side, but 25638798209852 minutes of lag because of the panel.
On a scale of fakeness, from more genuine to more fake, we'd have:

1.- Plastic plants (cf. Fake Plastic Trees)
2.- Inflatable dolls
3.- Arcade cabinets with LCD monitors

Elaphe666

Re: Input lag VS...
« Reply #4 on: December 20, 2024, 06:55:32 am »
Quote
Always remember that CRTs add virtually 0 lag.

With an LCD you might have sub-frame input lag on the machine side, but 25638798209852 minutes of lag because of the panel.

Correct me if I am wrong, but aren't modern monitors and TVs achieving response times of around 1 ms? I don't think that is even noticeable.

Also, when Calamity said "sub-frame latency", how many milliseconds are we talking about?

donluca

Re: Input lag VS...
« Reply #5 on: December 21, 2024, 05:55:23 pm »
I think you're mixing up the panel's own response time, which indeed is negligible today, with the TV/monitor's own input lag.

I haven't done much research because I'm not really into LCDs, but I know that some are very good *IF* you feed them their native resolution; if they have to do any kind of scaling, they'll add input lag.

But AFAIK most of the normal stuff you'd find around, even so-called "gaming" monitors, definitely has input lag.

Look at RTINGS.com for reviews and measurements.

This is their best monitor at the moment with the lowest lag:

Quote
Input Lag
  Native Resolution @ Max Hz: 3.7 ms
  Native Resolution @ 120 Hz: 6.0 ms
  Native Resolution @ 60 Hz: 11.8 ms
  Backlight Strobing (BFI): N/A

The Samsung Odyssey OLED G8/G80SD S32DG80 has low input lag for a responsive feel when VRR Control is 'Off'. However, when it's turned on, the input lag at the max refresh rate is 24.6 ms.

https://www.rtings.com/monitor/reviews/samsung/odyssey-oled-g8-g80sd-s32dg80

Here you'd be using it with VRR on, and at the max refresh rate that's 24.6 ms, which is pretty hideous (over 1 frame of lag).
You'll be using it at much lower refresh rates for arcade games, so it's going to be even worse.

https://www.rtings.com/monitor/reviews/

Elaphe666

Re: Input lag VS...
« Reply #6 on: December 23, 2024, 07:23:40 am »
Understood. On the other hand, how much input lag does vsync add, around 50 ms?
« Last Edit: December 23, 2024, 07:25:26 am by Elaphe666 »

donluca

Re: Input lag VS...
« Reply #7 on: December 23, 2024, 02:58:42 pm »
A single frame, about 16.7 ms at 60 Hz.

EDIT: I just noticed all the measurements for that monitor were taken at native resolution.
The lag can double (or worse) at other resolutions, since the monitor has to upscale.

EDIT: seriously, if lag is that game-breaking to you, just get a CRT and be over it.
That's what we've all done.
« Last Edit: December 23, 2024, 03:01:11 pm by donluca »

Calamity

Re: Input lag VS...
« Reply #8 on: December 23, 2024, 04:31:45 pm »
Quote
Also, when Calamity said "sub-frame latency", how many milliseconds are we talking about?

Sub-frame latency is another way to say next frame response. In ms, it's whatever is below 1 frame (around 16.67 ms). This means there is a chance that input registered in the current frame has an effect on the next one.

A VRR monitor can have latency as low as, or even lower than, a CRT, and without the need for frame delay. Freesync/G-sync is equivalent to "frame delay 10".

Here you can see the results I got some time ago on an LG with Freesync; the measured end-to-end latency is around 4 ms.
https://github.com/mamedev/mame/pull/5901

On a CRT with a high frame delay, I've measured around 4.5-4.7 ms, but that is under ideal conditions.

Anyway, the absolute minimum is determined by the time it takes to emulate a frame. So if one frame takes 5 ms to emulate, the end-to-end latency will be 5 ms plus the rest of the system overhead.

So the values above were measured on a game that is emulated very fast (sf2, at least it was fast back in 2019, not so much nowadays), thus allowing a high frame delay value.

The emulation time affects every setup; it's not a display thing. It's just that with a Freesync monitor you don't need to manually find the frame delay value. Freesync is like automatic frame delay.
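The arithmetic behind these figures can be sketched roughly. This is a simplified model, not Calamity's measurement method: the 0-9 range and the tenths-of-a-frame scaling follow the commonly described GroovyMAME frame delay behaviour.

```python
FRAME_HZ = 60.0
FRAME_MS = 1000.0 / FRAME_HZ  # one frame period: ~16.67 ms

def vsync_slack_ms(frame_delay: int) -> float:
    """Worst-case wait between input sampling and scan-out when
    emulation is deferred to frame_delay/10 of the frame period."""
    if not 0 <= frame_delay <= 9:
        raise ValueError("frame delay is 0-9")
    return FRAME_MS * (10 - frame_delay) / 10.0

def min_end_to_end_ms(emulation_ms: float, overhead_ms: float) -> float:
    """The latency floor: you can never respond faster than the time
    it takes to emulate the frame, plus system overhead."""
    return emulation_ms + overhead_ms

print(round(vsync_slack_ms(0), 2))   # no frame delay: a bit over one frame of wait
print(round(vsync_slack_ms(9), 2))   # high frame delay: well under one frame
print(min_end_to_end_ms(5.0, 1.0))   # 5 ms emulation can never go below 5 ms + overhead
```

This is also why a fast-to-emulate game like sf2 can take a high frame delay value: its emulation time fits inside the small remaining slack.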

GroovyMiSTer implements automatic frame delay. With GroovyMiSTer I've measured latency around 3-3.5 ms on a CRT.

Anyway, I wouldn't trust an LCD monitor's reported figures unless I measured them directly. They are much like a black box.
