Hi vicosku,
This is the typical obscure problem that looks like it has no logical explanation. Sadly, because video drivers are a damned black box, all we can do is try to figure out what may be going on under the hood.
How can the exact same modeline produce a different refresh rate with the same card? I can think of two possible causes.
The first one is an unreliable dotclock. We often think of these things as if they were fully deterministic, and ideally they would be. But in real life, at the end of the chain there's an oscillator, which may be affected by things like your room temperature. Some dotclock values are more consistent than others; usually all but a few are very consistent. Unfortunately, it's not possible to know beforehand which ones are inconsistent so as to bypass them. This 6.63 MHz value may be problematic with your specific card. You can probably choose a different value, higher or lower, that works better with your card, and modify the modeline to use it. This is easily done with ArcadeOSD, by testing the refresh speed (use key "5"). Once you have a satisfactory modeline, you have to put it into your game-specific .ini. Pay attention to this: you must put a modeline in your ini file (yeah, a raw modeline: "modeline "320x224_60 15.61KHz 59.14Hz" 6.62 ..."), NOT a crt_range. To get that modeline, write down the values from ArcadeOSD.
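To see why a small dotclock change shifts the refresh rate, here's a minimal sketch of the arithmetic behind a modeline. Only the 6.62 MHz dotclock comes from the modeline quoted above; the horizontal and vertical totals below are hypothetical placeholders chosen so the result matches the quoted "15.61KHz 59.14Hz" label.

```python
# Sketch: how a modeline's dotclock determines horizontal and vertical frequency.
# NOTE: h_total and v_total are hypothetical placeholders, not the poster's
# actual timing values (those were elided with "..." in the post).
dotclock_hz = 6.62e6   # pixel clock from the quoted modeline, in Hz
h_total = 424          # hypothetical total pixels per scanline (incl. blanking)
v_total = 264          # hypothetical total lines per frame (incl. blanking)

h_freq_khz = dotclock_hz / h_total / 1000        # scanlines per second, in kHz
v_freq_hz = dotclock_hz / (h_total * v_total)    # frames per second, in Hz

print(f"{h_freq_khz:.2f} kHz, {v_freq_hz:.2f} Hz")  # ~15.61 kHz, ~59.14 Hz
```

Since the totals are fixed for a given mode, any drift in the oscillator's actual output frequency translates directly into a proportional drift of the refresh rate.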
The second possible cause is that a different CRTC (CRT controller) has been in use since your day #2 tests. A video card usually has 2 independent CRTCs, which can be mapped to either of the outputs. The video drivers are responsible for this mapping; the OS doesn't care. So maybe on the first day, due to the specific sequence in which you connected and disconnected your various monitors, say CRTC #1 ended up mapped to the DVI-D output. Then you turn off the computer, and the next day, when it boots with just the arcade monitor connected, the drivers decide it's a better idea to map CRTC #2 to the DVI-D output. Now suppose that for some reason CRTC #1 provides slightly different refresh values than CRTC #2. (Whether this is the reason you're seeing a different number in the monitor section of your logs, I don't know.)
Finally, as you know, GM forces the sound to be synchronized with the video refresh when the -syncrefresh option is used (almost always). This removes most sound glitches. The problem is that when you use the -frame_delay option, this sound synchronization is disabled, because I still haven't found a way to make it work reliably with that option. The consequence is that unless you get an exact figure for the refresh rate (rare), you may hear some audio glitches (the ones that are masked when sound synchronization is active). Thus, manually adjusted modelines may be required if you want to get absolutely perfect sound (as suggested above).
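Just to make the "raw modeline in your ini" idea concrete, a game-specific ini could look something like the sketch below. The game name and all the timing numbers after the dotclock are hypothetical placeholders; use the exact values ArcadeOSD reports for your card.

```ini
# Hypothetical ini/sf2.ini (game name is just an example).
# A raw modeline, NOT a crt_range. Everything after "6.62" is a placeholder;
# copy the real values from ArcadeOSD.
modeline "320x224_60 15.61KHz 59.14Hz" 6.62 320 336 368 424 224 236 239 264 -hsync -vsync
```

With -frame_delay in use, this manually tuned modeline is what gets you as close as possible to the emulated refresh, since the automatic sound synchronization can't mask the difference.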