Hi mamenewb100,
Well, the thickness of the scanlines cannot be controlled by software; it's inherent to the monitor. So a high resolution monitor (31 kHz) has thinner scanlines, as it has to draw twice as many lines in the same space.
So yes, basically what you do is draw two scanlines for each one of the original resolution. This produces a typically "blocky" effect compared to a real low resolution display, but it's not too bad provided you don't stretch things.
GroovyMAME's scaling works like this, e.g.: 320 x 224p (15 kHz) -> 640 x 448p (31 kHz)
Notice we shouldn't really need to scale xres; only yres is relevant for frequency purposes. But we do it because most people would get uneasy if we only scaled in one direction (I mean, using 320 x 448p in this example would look *exactly* the same).
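To make the arithmetic concrete, here's a minimal sketch (not GroovyMAME's actual code) of why doubling yres roughly doubles the horizontal frequency. The `blanking_lines` figure is an assumed round number for vertical blanking; real modelines tune it per mode.

```python
def line_double(width, height, refresh_hz, blanking_lines=45):
    """Scale a 15 kHz mode toward 31 kHz by doubling both axes.

    blanking_lines is an assumed, illustrative figure; real
    modeline generators compute it per mode.
    """
    vtotal = height + blanking_lines          # total lines per frame
    hfreq_khz = vtotal * refresh_hz / 1000.0  # horizontal frequency
    # Doubling yres doubles the line count per frame, and since the
    # refresh rate stays the same, the line rate doubles too.
    return (width * 2, height * 2, hfreq_khz * 2)

w, h, hf = line_double(320, 224, 60)
print(w, h, hf)  # 640 448, with a line rate in the 31 kHz range
```

Only the yres doubling changes the line rate here; the xres doubling is purely cosmetic, which is the point made above.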
Now, how this is done depends on the OS. In Linux we can create resolutions on the fly, so we just calculate the scaled modeline, whatever it is, and create it; that's all.
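Under X11, for instance, creating a mode on the fly looks something like this (the modeline numbers and the `VGA-1` output name are illustrative assumptions, not a tuned GroovyMAME modeline):

```shell
# Define a new mode, attach it to an output, and switch to it.
xrandr --newmode "640x448_60" 13.85 640 672 736 872 448 451 454 529 -hsync -vsync
xrandr --addmode VGA-1 "640x448_60"
xrandr --output VGA-1 --mode "640x448_60"
```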
In Windows we can't create resolutions on the fly, so chances are the resulting scaled resolution won't be available. That's why we need to precalculate a table with all the required scaled resolutions, install it, and have it ready to use. Unfortunately we can't do this with the AVGA, so it will need to grab the closest resolution and possibly stretch things; it's somewhat unpredictable, and that's why the results tend to suck when scaling.