I've been doing some lag testing on my Dell 1913 TN panel. System 1 is JROK FPGA hardware emulation; system 2 is GroovyMAME 0.171 with frame_delay 9. The game is Defender and I'm testing the shot response.
I'm recording video at 120fps, with multiple LEDs mounted vertically to give a better estimate of when the button was pressed. I count in 'half-frames' of 8ms and then adjust. I've counted over 300 GroovyMAME shots and over 100 JROK shots.
I take readings at two ship positions: High (approx 1/4 down the screen) and Low (approx 3/4 down the screen). So these positions are 8ms apart in a 17ms scanout.
The average result for JROK is: High = 30ms, Low = 29ms
The average result for groovymame is: High = 44ms, Low = 35ms
On original hardware Defender polls inputs every 8ms, and the expected response time is the same in both halves of the screen, so the JROK result is as I would expect (though perhaps a bit slower than I'd anticipated).
For GroovyMAME, I was expecting it to be faster than JROK at the top of the screen and slower at the bottom. Given the Low result above, I'd expect the GroovyMAME High result to be 27ms (35-8), and I would have been happy with that.
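To make that expectation explicit, here's the arithmetic as a quick sketch. The 8ms poll interval, 17ms scanout, ship positions, and measured averages are the numbers from above; the beam-chasing model itself is my assumption about how frame_delay should behave:

```python
# Rough latency model (my assumption): with a high frame_delay, GroovyMAME's
# output should track the beam, so a shot drawn lower on the screen appears
# later within the same scanout -- just like a CRT.

SCANOUT_MS = 17        # one full frame scanout (~60 Hz)
POLL_MS = 8            # Defender polls inputs every 8 ms

# Measured GroovyMAME averages from my counts above
gm_high = 44           # ship ~1/4 down the screen
gm_low = 35            # ship ~3/4 down the screen

# The two ship positions are ~8 ms apart in the scanout
position_gap = (3/4 - 1/4) * SCANOUT_MS    # = 8.5 ms, i.e. roughly 8 ms

# If the emulator chases the beam, the High result should be roughly
# the Low result minus that gap:
expected_high = gm_low - POLL_MS           # 35 - 8 = 27 ms

print(f"position gap ≈ {position_gap:.1f} ms")
print(f"expected High ≈ {expected_high} ms, measured High = {gm_high} ms")
```

The measured High result is 17ms slower than this model predicts, a full frame, which is the oddity I'm trying to explain.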
So... that's the odd result, and now I'm trying to explain it (and then maybe fix it, or at least improve the emulation). I'm working my way through the MAME documentation and trying to understand the code, but it's all new to me, so this could be a long project.
I'm mentioning it here in the hope of finding someone knowledgeable and interested enough to help me out. If I manage to work it out myself I'm sure that will be fulfilling, but I'd be even happier if someone gave me a solution tomorrow.