I'm not exactly sure what the heart of your question is, but I'll take a shot at it.
Calamity's drivers allow Windows users to create manual modelines. Windows doesn't have any built-in support for this, so you need hacked drivers. In short, a modeline describes exactly how the raster scan is timed, and that timing determines the pixel resolution and refresh rate.
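For reference, here is what a modeline looks like in the classic X11 syntax (the numbers are illustrative ones I made up, not tuned for any real monitor): a name, the pixel clock in MHz, four horizontal values (visible pixels, sync start, sync end, total), the same four vertically, and sync polarity flags.

    Modeline "320x240@60" 6.40  320 336 367 408  240 244 247 262 -hsync -vsync

The "total" figures are larger than the visible ones because they include the invisible blanking periods; those gaps are what the arithmetic below is about.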
For a CRT monitor, the pixel resolution and refresh rate are flexible (within certain bounds). Running an arcade game in any pixel resolution other than the one it was intended for (the native resolution) will cause distortion of the image. Running it with a refresh rate other than the native rate will cause either tearing or skipping. Both of these effects can range from barely discernible to completely unplayable.
There are limitations on what modelines a given monitor can sync with; a little arithmetic below shows why. Basically, a modern monitor scans too quickly to display the old arcade resolutions, so something has to give. The most common solution seems to be to double the line count and keep the refresh rate equal to the native rate.
Here's an example from the most awesome game ever, BurgerTime:
Native resolution is 240x240 pixels at 57.44 Hz
240 lines * 110% * 57.44 frames per second = 15,164 Hz ~ 15.2 kHz, which is right in the heart of the sync range for an arcade monitor, but well below the minimum horizontal scan rate that most modern monitors will accept.
Where does the 110% come from? It's just a rule of thumb, not an absolute. It accounts for the cycles during which nothing visible is drawn: the time for the electron beam to sweep back to the left at the end of each line (horizontal blanking, which contains the sync pulse), and the time for the beam to return to the top left corner after the last line is painted (vertical blanking). If you ever write manual modelines, you specify exactly how many cycles each of these takes.
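To make that concrete, here is the exact version of the arithmetic, using the horizontal and vertical totals from the illustrative modeline above (a quick Python sketch; the figures are mine, not from any shipping driver):

    pixel_clock = 6_400_000  # Hz, the modeline's pixel clock (6.40 MHz)
    h_total = 408            # pixel clock cycles per scanline, blanking included
    v_total = 262            # scanlines per frame, blanking included
    v_visible = 240          # scanlines actually painted

    h_freq = pixel_clock / h_total  # ~15686 Hz: horizontal scan rate
    v_freq = h_freq / v_total       # ~59.9 Hz: refresh rate
    print(v_total / v_visible)      # ~1.09, which is where the ~110% figure comes from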
To get BurgerTime up to the minimum scan rate my monitor can display (its range is 30-130 kHz), I have to double the line count.
480 lines * 110% * 57.44 Hz ~ 30.3 kHz.
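The same rule-of-thumb estimate as a tiny Python helper (approx_hfreq is just a name I made up for illustration):

    def approx_hfreq(visible_lines, vfreq_hz, overhead=1.10):
        """Estimate the horizontal scan rate a mode will need."""
        return visible_lines * overhead * vfreq_hz

    print(approx_hfreq(240, 57.44))  # ~15164 Hz: fine for an arcade monitor
    print(approx_hfreq(480, 57.44))  # ~30328 Hz: just clears a 30 kHz PC monitor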
The visual distortion caused by doubling the number of pixels is negligible (in my personal opinion). A non-integer scaling factor, on the other hand, usually has very noticeable effects on low-res graphics.
Running the game at 60 Hz on my LCD laptop monitor with triple buffering causes the hotdogs to skip around in a distracting way instead of waddling in their normal endearing manner. Running the game with frame sync instead causes a noticeable speedup (60 / 57.44 ~ 1.045, so about 4.5% faster). While this doesn't totally kill BurgerTime for me, it hurts the high scores a little bit.
There is a LOT more that can be said about all of the above. If you are really interested in modelines, the old X Window System documentation (for example, the XFree86 Video Timings HOWTO) is quite good.