
Input lag VS...


donluca:
I think you're mixing up the panel's own response time, which is indeed negligible today, with the TV/monitor's own input lag.

I haven't done much research because I'm not really into LCDs, but I know that some are very good *IF* you feed them their native resolution; otherwise, if they have to do any kind of scaling, they'll add input lag.

But AFAIK most of the normal stuff you'd find around, even so-called "gaming" monitors, definitely has input lag.

Look at RTINGS.com to see some reviews and measurements.

This is their best monitor at the moment with the lowest lag:


--- Quote ---
Input Lag
Native Resolution @ Max Hz: 3.7 ms
Native Resolution @ 120 Hz: 6.0 ms
Native Resolution @ 60 Hz: 11.8 ms
Backlight Strobing (BFI): N/A

The Samsung Odyssey OLED G8/G80SD S32DG80 has low input lag for a responsive feel when VRR Control is 'Off'. However, when it's turned on, the input lag at the max refresh rate is 24.6 ms.
--- End quote ---

https://www.rtings.com/monitor/reviews/samsung/odyssey-oled-g8-g80sd-s32dg80

Here you'd be using it with VRR on, and at the max refresh rate the input lag is 24.6 ms, which is pretty hideous (over 1 frame of lag).
You'll be using it at much lower refresh rates for arcade games, so it's going to be even worse.
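
As a back-of-the-envelope illustration, here's a quick Python sketch that expresses a measured lag figure in frames at a given refresh rate; the lag values are just the ones quoted above, and 60 Hz is only an example rate typical of arcade games:

--- Code: ---
# Express a measured input lag in frames at a given refresh rate.
# Lag figures are the ones quoted from RTINGS above; 60 Hz is just an example.

def frame_time_ms(refresh_hz):
    """Duration of one frame in milliseconds."""
    return 1000.0 / refresh_hz

def lag_in_frames(lag_ms, refresh_hz):
    """How many frames a lag figure in ms represents at that refresh rate."""
    return lag_ms / frame_time_ms(refresh_hz)

print(f"{lag_in_frames(24.6, 60):.2f} frames")  # ~1.48 -> over one 60 Hz frame
print(f"{lag_in_frames(11.8, 60):.2f} frames")  # ~0.71 -> under one frame
--- End code ---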

https://www.rtings.com/monitor/reviews/

Elaphe666:
Understood. On the other hand, how much input lag does vsync add, around 50 ms?

donluca:
A single frame, about 16 ms (at 60 Hz).
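
For reference, "a single frame" depends on the game's refresh rate; a minimal sketch, assuming a few example refresh rates:

--- Code: ---
# One frame of added vsync latency is simply the frame period,
# which depends on the game's native refresh rate (example values only).
for hz in (50.0, 55.0, 57.5, 60.0):
    print(f"{hz:5.1f} Hz -> {1000.0 / hz:5.2f} ms per frame")
# 60 Hz gives ~16.67 ms, which is where the "16 ms" figure comes from.
--- End code ---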

EDIT: I just noticed all the measurements for that monitor were taken at native resolution.
Expect double the lag (or worse) at other resolutions, since the monitor has to upscale.

EDIT: seriously, if lag is that game-breaking to you, just get a CRT and be over it.
That's what we've all done.

Calamity:

--- Quote from: Elaphe666 on December 20, 2024, 06:55:32 am ---Also, when Calamity said "sub-frame latency", how many milliseconds are we talking about?

--- End quote ---

Sub-frame latency is another way of saying next-frame response. In ms, it's anything below one frame (around 16.67 ms). It means there's a chance that input registered during the current frame has an effect on the next one.

A VRR monitor can have latency as low as a CRT, or even lower, without the need for frame delay. Freesync/G-Sync is equivalent to "frame delay 10".
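
A rough sketch of what that equivalence means, assuming the usual GroovyMAME convention where frame_delay N (N from 0 to 9) postpones emulation until N/10 of the frame period has elapsed; this is only an approximation, not an exact model:

--- Code: ---
# Rough model, not exact: with plain vsync the finished frame sits in the
# back buffer until the next flip. frame_delay N (0-9 in GroovyMAME)
# postpones emulation until N/10 of the frame period has passed, so the
# leftover buffering wait is roughly the remaining (10 - N)/10 of a frame.

FRAME_MS = 1000.0 / 60.0  # assuming a 60 Hz game

def buffering_wait_ms(frame_delay):
    """Approximate time the finished frame waits for the next flip."""
    return FRAME_MS * (10 - frame_delay) / 10.0

for fd in (0, 5, 9, 10):  # 10 isn't a real setting; it's the VRR-like limit
    print(f"frame_delay {fd:2d} -> ~{buffering_wait_ms(fd):5.2f} ms buffering wait")
--- End code ---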

Here you can see the results I got some time ago on an LG with Freesync; the measured end-to-end latency is around 4 ms.
https://github.com/mamedev/mame/pull/5901

On a CRT with a high frame delay, I've measured around 4.5-4.7 ms, but that's under ideal conditions.

Anyway, the absolute minimum is determined by how long it takes to emulate a frame. So if one frame takes 5 ms to emulate, the end-to-end latency will be 5 ms plus the rest of the system overhead.

So the values above were measured on a game that emulates very fast (sf2; at least it was fast back in 2019, not so much nowadays), which allows a high frame delay value.
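
A small sketch of that floor, using made-up emulation times: the per-frame emulation time sets the minimum end-to-end latency and also caps how high frame delay can go before the frame misses vsync (the 1 ms overhead and the 0-9 frame_delay range are assumptions):

--- Code: ---
# Illustration with hypothetical numbers: per-frame emulation time sets the
# latency floor and limits how high frame_delay can be pushed before the
# frame misses vsync.
import math

FRAME_MS = 1000.0 / 60.0  # assuming a 60 Hz game

def min_end_to_end_ms(emulation_ms, overhead_ms):
    """Best case: input -> emulate -> scan out, plus fixed system overhead."""
    return emulation_ms + overhead_ms

def max_frame_delay(emulation_ms):
    """Highest frame_delay (0-9) that still leaves time to emulate the frame."""
    return min(9, math.floor(10 * (1 - emulation_ms / FRAME_MS)))

for emu_ms in (2.0, 5.0, 12.0):  # hypothetical per-frame emulation times
    print(f"{emu_ms:4.1f} ms to emulate -> floor ~{min_end_to_end_ms(emu_ms, 1.0):4.1f} ms"
          f" (with 1 ms overhead), max frame_delay {max_frame_delay(emu_ms)}")
--- End code ---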

The emulation time affects every setup; it's not a display thing. It's just that with a Freesync monitor you don't need to manually find the frame delay value. Freesync is like automatic frame delay.

GroovyMiSTer implements automatic frame delay. With GroovyMiSTer I've measured latency around 3-3.5 ms on a CRT.

Anyway, I wouldn't trust an LCD monitor's reported figures unless I measured them directly. They're much like a black box.
