Ok, have a look at this log:
SwitchRes: Monitor range 15750.00-16250.00,45.00-80.00,0.900,4.700,8.400,0.062,0.062,0.740,0,0,160,288,320,448
SwitchRes: Monitor range 48363.00-50750.00,50.00-61.00,0.369,2.092,2.462,0.062,0.124,0.600,0,0,480,800,0,0
SwitchRes: v0.014b:[bnzabros] Calculating best video mode for 496x384@57.524158 orientation: normal
rng(0): 496 x 384_57.524i 15.790 [integ] scale(1, 1, 1) diff(0.00, 0.00, 0.0000) ratio(1.000, 1.000)
rng(1): 1072 x 800_57.524p 48.378 [fract] scale(2, 2, 1) diff(0.00, 11.41, 0.0000) ratio(2.161, 2.083)
You're defining crt_range0 to interlace input resolutions between 320 and 448 lines. This means 384 will be considered a candidate for interlacing within that range. So the first step is to raise the lower bound of that interlace window, like this:
crt_range0 15750.00-16250.00,45.00-80.00,0.900,4.700,8.400,0.062,0.062,0.740,0,0,160,288,448,480
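To see why 384 was being caught, here is a minimal sketch, assuming the last four values of a crt_range are ProgressiveLinesMin, ProgressiveLinesMax, InterlacedLinesMin and InterlacedLinesMax (the helper names are mine, not SwitchRes code):

```python
def line_limits(range_str):
    """Return (prog_min, prog_max, int_min, int_max) from a crt_range value."""
    return tuple(int(f) for f in range_str.split(",")[-4:])

def interlaces(range_str, lines):
    """True if a mode with this many active lines falls in the interlace window."""
    _, _, i_min, i_max = line_limits(range_str)
    return i_min != 0 and i_min <= lines <= i_max

original = "15750.00-16250.00,45.00-80.00,0.900,4.700,8.400,0.062,0.062,0.740,0,0,160,288,320,448"
adjusted = "15750.00-16250.00,45.00-80.00,0.900,4.700,8.400,0.062,0.062,0.740,0,0,160,288,448,480"

print(interlaces(original, 384))  # True: 384 falls inside 320-448
print(interlaces(adjusted, 384))  # False: the window now starts at 448
```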
Second, you need to redirect those resolutions to the second range. You do this by lowering the lower bound of the progressive range there:
crt_range1 48363.00-50750.00,50.00-61.00,0.369,2.092,2.462,0.062,0.124,0.600,0,0,384,800,0,0
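The effect of that change can be sketched the same way, assuming the last four values of a crt_range are the progressive and interlaced line limits (the helper name is mine, not SwitchRes code): with the progressive minimum lowered from 480 to 384, a 384-line mode now matches crt_range1.

```python
def matches_progressive(range_str, lines):
    """True if a mode with this many active lines fits the progressive window."""
    p_min, p_max = (int(f) for f in range_str.split(",")[-4:-2])
    return p_min <= lines <= p_max

old_range1 = "48363.00-50750.00,50.00-61.00,0.369,2.092,2.462,0.062,0.124,0.600,0,0,480,800,0,0"
new_range1 = "48363.00-50750.00,50.00-61.00,0.369,2.092,2.462,0.062,0.124,0.600,0,0,384,800,0,0"

print(matches_progressive(old_range1, 384))  # False: the minimum was 480
print(matches_progressive(new_range1, 384))  # True: 384 is within 384-800
```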
It's not quite clear to me why you want these resolutions doublescanned. Shouldn't your monitor be able to show them at a real 25 kHz? (Maybe you already tested this, I can't remember right now.)
Regarding the games that originally run interlaced, like Popeye, we have a problem: MAME just reports 448@60Hz, so there's no way GM can figure out these were interlaced. The only workaround is to create an ini file for those games that removes crt_range1 (sets it to "auto").
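For example, such a per-game ini would just contain this line (the filename follows MAME's per-game ini convention; where you keep your ini folder is up to your setup):

```
popeye.ini
----------
crt_range1 auto
```

With crt_range1 disabled for that game, only crt_range0 is available, so the 448-line mode can't be redirected to the 31 kHz progressive range.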