Surely with the slider options in MAME you can mimic the original behaviour much better than most of the Cave ports do. But it seems nobody is concerned enough to make the effort and share the results after all this time.
I've been interested in that for quite a while, but it's too much for one person; it would have to be a common effort/contribution. Quite a few people were interested a few years ago, but two or three changes in MAME kinda pushed the issue to later (a couple of driver performance optimizations, and the removal of saving CPU slider values).
Yet the optimizations were done, and now that slider saving is back in GM, it seems too much time has passed. I still frequently see people interested in playing the cv1k games in emulation as well as possible, but either they're bewildered by Groovy's rumored complexity (don't ask me why -- for this purpose there isn't much to configure, yet more than a couple of settings seem to scare people off), or they're not aware of the sliders, or they just don't believe it.
A wasted opportunity IMO, including not showing off what Groovy+frame_delay can achieve delay-wise.
AFAIK some compatibility issues with baseline MAME in RetroArch might be solved soon, which might mean run-ahead finally working for it. If that's confirmed, you can bet no one who was interested in Groovy for playing those cv1k games will even consider trying it anymore; they'll just go to RA and push run-ahead mindlessly.
Looks like a commendable effort, but some things are indeed questionable. If he's getting "100% consistency" with frame delay at 6, he's either not using GM properly or not taking enough samples, unless I'm missing something. Not posting his PC, Windows and GM configurations also makes his effort futile in these cases. And we don't know how "good" the controller he's using on Windows for this is, either (does he?).
Not sure, the stick is a VX-SA though, allegedly the fastest known stick according to that website: http://www.teyah.net/sticklag/results.html -- but on Windows (and which Windows) I dunno.
I don't find anything uncommon in the figures, anyhow. If those Sega Saturn and PS2 games lag 3 frames for him (and I always found them perfectly responsive), the GM figures are basically optimal -- I'd hardly believe that the CV-1000 games' native lag is below or the same as PS2 Dai-Ou-Jou or SS Batsugun, for instance. That's the other big issue of that chart -- it lacks tests of the CV-1000 PCB counterparts. (And, for the only PCB he did measure, he didn't bother to make the comparison with GM. (!))
Not sure about all that; I was taking Futari as reference, since according to M2 the PCB lags by the equivalent of 2 frames, and if you remember, the 360 port offers to adjust your delay to match that. That's what he measured as well,
but with VSYNC OFF... why? It's supposed to stay vsynced and the delay still be that, so that's how it should have been measured.
There's a belief, probably inherited from the PC gaming world, that vsync always adds significant lag no matter what. But when you try to explain to people that something like what Groovy does lets you keep the sync while keeping the cost under 1 frame, down to even a mere 2~3ms under the right conditions, they simply don't believe it or don't understand what it means (like some of those RA/ShmupArch users who think even the game's native lag is input delay to be eliminated, period *shrug*).
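To put rough numbers on that claim, here's a back-of-the-envelope sketch, assuming a 60 Hz game and the usual reading of GM's frame_delay scale (each step on the 0-9 scale shifts emulation of the frame roughly a tenth of a frame period later, so only the remainder is left as vsync latency). Illustrative arithmetic only, not a measurement:

```python
# Worst-case vsync latency left over at a given frame_delay value,
# assuming frame_delay N starts emulating N/10 into the frame period.
FRAME_MS = 1000 / 60  # ~16.67 ms per frame at 60 Hz

def residual_latency_ms(frame_delay: int) -> float:
    """Approximate leftover latency (ms) for frame_delay 0-9."""
    return FRAME_MS * (10 - frame_delay) / 10

for fd in (6, 7, 8, 9):
    print(f"frame_delay {fd}: ~{residual_latency_ms(fd):.1f} ms")
# frame_delay 6: ~6.7 ms
# frame_delay 7: ~5.0 ms
# frame_delay 8: ~3.3 ms
# frame_delay 9: ~1.7 ms
```

So the "2~3ms" figure corresponds to holding frame_delay around 8-9, which is exactly why it needs a strong CPU to stay stable.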
In any case, we can measure the driver's default delay in Groovy (does the driver lag by 2 or 3 frames by default, or is it different per game?). That's something I'll probably do when I have the time, to make sure it's done properly, and then deduce what can be achieved with frame_delay.
But that still doesn't tell us the upper limits, nor what one can hope to achieve with which CPU...
Really, it doesn't have to be 9; a stable 8 or even 7 puts you in the decently achievable "under half a frame of delay" zone, which is like having no input lag at all, assuming your hardware/OS are clean of most additional delay.
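For reference, the setting being discussed is a single line in mame.ini; the value below is illustrative, not a recommendation -- what's actually stable depends entirely on your CPU and the driver:

```ini
# mame.ini excerpt -- tune per game; drop the value if you get stutter
frame_delay  7   # 0-9: higher trims more latency but needs more CPU headroom
```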