The NEW Build Your Own Arcade Controls

Software Support => GroovyMAME => Topic started by: gabe on July 05, 2011, 10:36:24 am

Title: GroovyMAME selecting modelines that my system doesn't like?
Post by: gabe on July 05, 2011, 10:36:24 am
I originally posted about this issue in the GroovyArcade Linux thread (http://forum.arcadecontrols.com/index.php?topic=107620.msg1197412#msg1197412) but decided to bring the discussion over here, as I believe it is a better fit.

Using GroovyMAME in Linux, I have found several games which crash MAME right around the time a modeline is selected. With logging, I get something similar to this (from baddudes):

Code: [Select]
Parsing mame.ini
SwitchRes: Found output connector 'VGA-0'
SwitchRes: Monitor: cga Orientation: horizontal Aspect 4:3
SwitchRes v0.013: [baddudes.zip] (1) horizontal (256x240@57.39)->(256x240@57.39)->(256x240@57.39)
SwitchRes: # baddudes.zip 256x240@57.39 15.2663Khz
SwitchRes:      ModeLine          "256x240x57.39" 5.251606 256 272 296 344 240 244 247 266 -HSync -VSync
SwitchRes: Setting Option -redraw 0
SwitchRes: Setting Option -rotate
SwitchRes: Setting Option -nothrottle
SwitchRes: Setting Option -refreshspeed
SwitchRes: Setting Option -waitvsync
SwitchRes: Xrandr ADD VGA-0:    ModeLine          "256x240x57.39" 5.251606 256 272 296 344 240 244 247 266 -HSync -VSync
SwitchRes: Running 'xrandr  --newmode      "256x240x57.39" 5.251606 256 272 296 344 240 244 247 266 -HSync -VSync'
SwitchRes: Running 'xrandr  --addmode VGA-0 256x240x57.39'
SwitchRes: Setting Option -resolution 256x240x32@57.392092
Build version:      0.143 (Jun 29 2011)
Build architecure:  SDLMAME_ARCH=
Build defines 1:    SDLMAME_UNIX=1 SDLMAME_X11=1 SDLMAME_LINUX=1
Build defines 1:    LSB_FIRST=1 PTR64=1 DISTRO=generic SYNC_IMPLEMENTATION=tc
SDL/OpenGL defines: SDL_COMPILEDVERSION=1214 USE_OPENGL=1 USE_DISPATCH_GL=1
Compiler defines A: __GNUC__=4 __GNUC_MINOR__=6 __GNUC_PATCHLEVEL__=0 __VERSION__="4.6.0 20110603 (prerelease)"
Compiler defines B: __amd64__=1 __x86_64__=1 __unix__=1
Compiler defines C: __USE_FORTIFY_LEVEL=0
SDL Device Driver     : x11
SDL Monitor Dimensions: 640 x 480
Enter sdlwindow_init
Using SDL single-window OpenGL driver (SDL 1.2)
Leave sdlwindow_init
 640x 480 -> 0.001600
 304x 224 -> 0.000015
 288x 224 -> 0.000020
 256x 240 -> 2.000000
Loaded opengl shared library: <default>
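As a sanity check on the log above, the logged frequencies can be derived from the modeline itself using standard CRT timing math: hfreq = dotclock/htotal and vfreq = hfreq/vtotal. A quick sketch, with the numbers copied from the baddudes ModeLine:

```shell
#!/bin/sh
# Sketch: derive horizontal and vertical frequencies from the modeline
# "5.251606 256 272 296 344 240 244 247 266" (dotclock MHz, then the
# horizontal and vertical timings; htotal is field 5, vtotal is field 9).
echo "5.251606 256 272 296 344 240 244 247 266" | awk '{
    dotclock = $1 * 1000000           # MHz -> Hz
    hfreq = dotclock / $5             # dotclock / htotal
    vfreq = hfreq / $9                # hfreq / vtotal
    printf "hfreq = %.4f kHz, vfreq = %.2f Hz\n", hfreq / 1000, vfreq
}'
# prints: hfreq = 15.2663 kHz, vfreq = 57.39 Hz
```

Both values match the "15.2663Khz" and "57.392092" figures SwitchRes reports, so the modeline itself looks internally consistent.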

I have found that I can avoid the above crash by creating a game-specific ini such as:

Code: [Select]
cleanstretch              1
switchres                 0

or by specifying a different resolution such as:

Code: [Select]
resolution                512x480@57.41
I should note that when I manually force a resolution, X Windows then stays in that resolution after exiting MAME.
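One possible workaround for the stuck resolution is a launcher wrapper that records the mode X is in before MAME starts and restores it afterwards. A sketch of the mode-detection part, assuming xrandr marks the active mode with '*'; the xrandr output here is illustrative sample text, not captured from a real system:

```shell
#!/bin/sh
# Sketch: find the mode X is currently in (xrandr flags it with '*') so
# a wrapper can restore it after MAME exits. A real wrapper would pipe
# actual `xrandr` output instead of this sample text.
mode=$(printf '%s\n' \
    'VGA-0 connected 640x480+0+0' \
    '   640x480   59.94*+' \
    '   512x480   57.41' \
    | awk '/\*/ {print $1; exit}')
echo "$mode"
# prints: 640x480
# in the wrapper, afterwards: mame "$@"; xrandr --output VGA-0 --mode "$mode"
```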

Does anyone have any thoughts/ideas/solutions?

If it helps, I'm using a Wells-Gardner 25K7191 with the following monitor_specs0 in mame.ini:

Code: [Select]
monitor_specs0            15100.00-16800.00,47.00-63.00,2.187,4.688,6.719,0.190,0.191,1.018,0,0,288,448

Title: Re: GroovyMAME selecting modelines that my system doesn't like?
Post by: bitbytebit on July 05, 2011, 12:51:01 pm
Using GroovyMAME in Linux, I have found several games which crash MAME right around the time a modeline is selected. [...]

Does it do it on the ISO image too? I hadn't seen whether you had tested just the liveCD, to double-check that there isn't some other factor at play besides an actual GroovyMAME bug. If you haven't already, try the ISO and see if the behavior can be repeated there too. That would help track down where things are going wrong more quickly, since it might be the X Windows/compiler builds/versions or something else like that.
Title: Re: GroovyMAME selecting modelines that my system doesn't like?
Post by: gabe on July 05, 2011, 07:00:08 pm
Does it do it on the ISO image too? I hadn't seen whether you had tested just the liveCD, to double-check that there isn't some other factor at play besides an actual GroovyMAME bug. If you haven't already, try the ISO and see if the behavior can be repeated there too. That would help track down where things are going wrong more quickly, since it might be the X Windows/compiler builds/versions or something else like that.

It works properly on the ISO. I set my monitor type to h9110 and baddudes runs perfectly at 256x240@57.41.

Just for kicks, I tried setting my monitor type to h9110 on my Arch install, but no dice.

I'm more than happy to experiment, take shots in the dark, etc... But I don't even really know where to begin with this particular issue.  ???
Title: Re: GroovyMAME selecting modelines that my system doesn't like?
Post by: bitbytebit on July 05, 2011, 08:27:33 pm
It works properly on the ISO. I set my monitor type to h9110 and baddudes runs perfectly at 256x240@57.41.

Just for kicks, I tried setting my monitor type to h9110 on my Arch install, but no dice.

I'm more than happy to experiment, take shots in the dark, etc... But I don't even really know where to begin with this particular issue.  ???
I'm guessing the crash is caused somewhere in how OpenGL, DRM, X Windows, and the ATI driver for X are compiled. I'm not sure what versions Arch uses or how it sets these up, but possibly that is the problem. It seems to be crashing out in OpenGL, so possibly the OpenGL version is too new (there are quite a few issues with the newest OpenGL and MAME), or there's yet another issue with certain resolutions.
Title: Re: GroovyMAME selecting modelines that my system doesn't like?
Post by: Calamity on July 06, 2011, 07:29:54 am
I'd check whether it's only a problem with that exact modeline, or rather a low-dotclock issue. If you look at the modeline generated for the cga monitor, it has a dotclock of 5.251606 MHz. I'd test games that use a dotclock a little lower or higher, to see if they're affected too, e.g. argus (-monitor_orientation vertical) and toki.
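One way to pick such test games is to estimate the dotclock a given native mode will need. A rough sketch; the blanking ratios (1.34 horizontal, 1.11 vertical) are assumptions read off the baddudes modeline above (344/256 and 266/240), not values SwitchRes actually uses:

```shell
#!/bin/sh
# Estimate a mode's dotclock as htotal * vtotal * vfreq, approximating
# the totals from the active resolution with assumed blanking ratios.
estimate_dotclock() {  # args: width height refresh_hz
    awk -v w="$1" -v h="$2" -v r="$3" \
        'BEGIN { printf "%.2f MHz\n", w * 1.34 * h * 1.11 * r / 1000000 }'
}
estimate_dotclock 256 240 57.39   # baddudes' mode: roughly 5.2 MHz
estimate_dotclock 512 480 57.41   # the workaround mode: roughly 21 MHz
```

The estimates are only ballpark figures, but they make it easy to sort candidate games into "below" and "above" the suspect dotclock range.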
Title: Re: GroovyMAME selecting modelines that my system doesn't like?
Post by: gabe on July 06, 2011, 01:34:33 pm
I'd check whether it's only a problem with that exact modeline, or rather a low-dotclock issue. If you look at the modeline generated for the cga monitor, it has a dotclock of 5.251606 MHz. I'd test games that use a dotclock a little lower or higher, to see if they're affected too, e.g. argus (-monitor_orientation vertical) and toki.

Both games fail in a similar fashion to baddudes.

I gather that this post (http://forum.arcadecontrols.com/index.php?topic=106405.msg1128440#msg112844) about Xorg ignoring dotclocks below 5 MHz no longer applies?

Also - if it makes any difference, I'm launching from wah!cade.
Title: Re: GroovyMAME selecting modelines that my system doesn't like?
Post by: Calamity on July 06, 2011, 04:28:24 pm
Both games fail in a similar fashion to baddudes.

I gather that this post (http://forum.arcadecontrols.com/index.php?topic=106405.msg1128440#msg112844) about Xorg ignoring dotclocks below 5 MHz no longer applies?

Also - if it makes any difference, I'm launching from wah!cade.

Interesting, it would be good to see if there's a clear pattern, i.e. all modelines above a given dotclock work, and the ones below don't, etc.

All those dotclock issues were overcome by bitbytebit in the GroovyArcade build by means of different patches, which I can't remember right now, but lately I believe it was only necessary to patch the legacy Radeon driver. Anyway, if applying the same patches does not fix things in Arch Linux, then maybe you should find all the places in the kernel where the dotclocks are checked.
Title: Re: GroovyMAME selecting modelines that my system doesn't like?
Post by: bitbytebit on July 06, 2011, 06:52:13 pm
Interesting, it would be good to see if there's a clear pattern, i.e. all modelines above a given dotclock work, and the ones below don't, etc.

All those dotclock issues were overcome by bitbytebit in the GroovyArcade build by means of different patches, which I can't remember right now, but lately I believe it was only necessary to patch the legacy Radeon driver. Anyway, if applying the same patches does not fix things in Arch Linux, then maybe you should find all the places in the kernel where the dotclocks are checked.

The newest versions of the Xorg/ATI drivers with DRM enabled actually don't use that low dotclock value, since it's all built into the DRM layer of the kernel now instead of the Xorg drivers. The kernel, if patched, should be the same as the one the ISO uses. It seems like something is not the same, but I'm not sure what it'd be. It's actually odd that certain games work and others don't. Yet if the kernel is right, that part shouldn't happen, since it's the central point where all the modeline calculations are done.
Title: Re: GroovyMAME selecting modelines that my system doesn't like?
Post by: gabe on July 06, 2011, 08:58:24 pm
The newest versions of the Xorg/ATI drivers with DRM enabled actually don't use that low dotclock value, since it's all built into the DRM layer of the kernel now instead of the Xorg drivers. The kernel, if patched, should be the same as the one the ISO uses. It seems like something is not the same, but I'm not sure what it'd be. It's actually odd that certain games work and others don't. Yet if the kernel is right, that part shouldn't happen, since it's the central point where all the modeline calculations are done.

6.162 MHz seems to be the lowest dotclock my system will take; 5.911 MHz and below results in a crash. I took the list of arcade modelines and searched MAWS for games by resolution, then tested several games for each modeline on either side of the 6.162 MHz dotclock.
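That kind of survey can be automated by filtering the ModeLine lines from a verbose GroovyMAME log against the observed floor. A sketch; the 6.162 MHz value is the empirical threshold from these tests, not a documented driver limit:

```shell
#!/bin/sh
# Flag modelines whose dotclock (third ModeLine field) falls below the
# empirically observed ~6.162 MHz floor. Reads a verbose log on stdin.
grep -o 'ModeLine.*' | awk -v floor=6.162 '{
    status = ($3 + 0 < floor) ? "below floor, likely to crash" : "ok"
    printf "%s %s MHz: %s\n", $2, $3, status
}'
```

Feeding it the baddudes log above would print `"256x240x57.39" 5.251606 MHz: below floor, likely to crash`.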

I'm using the same kernel version, with the same patches applied, as the ISO. I used a slightly different .config, so perhaps I should compare the two and take a closer look? If not, I suppose my next step will be to compare software versions on my Arch install against those on the ISO... unless anyone has any better ideas?
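For the .config comparison, diffing only the modesetting-related options keeps the noise down. A sketch; both paths are illustrative examples, not the actual locations on either system:

```shell
#!/bin/sh
# Compare only the DRM/framebuffer/AGP options of two kernel configs,
# since those govern the KMS modeline path. Paths are illustrative.
arch_cfg=/usr/src/linux/.config
iso_cfg=/mnt/iso/usr/src/linux/.config
grep -E '^CONFIG_(DRM|FB|AGP)' "$arch_cfg" | sort > /tmp/arch.drm
grep -E '^CONFIG_(DRM|FB|AGP)' "$iso_cfg"  | sort > /tmp/iso.drm
diff -u /tmp/arch.drm /tmp/iso.drm
```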
Title: Re: GroovyMAME selecting modelines that my system doesn't like?
Post by: Calamity on July 07, 2011, 03:48:26 am
The newest versions of the Xorg/ATI drivers with DRM enabled actually don't use that low dotclock value, since it's all built into the DRM layer of the kernel now instead of the Xorg drivers. The kernel, if patched, should be the same as the one the ISO uses. It seems like something is not the same, but I'm not sure what it'd be. It's actually odd that certain games work and others don't. Yet if the kernel is right, that part shouldn't happen, since it's the central point where all the modeline calculations are done.

However, I recall the ATI drivers were picking those dotclock limits from the BIOS itself, at least in the legacy combios part that we had to patch. Could it be that the hd2600 BIOS template he's using to make the AVGA3000 work is different from the one you used?
Title: Re: GroovyMAME selecting modelines that my system doesn't like?
Post by: gabe on July 07, 2011, 08:57:08 am
However, I recall the ATI drivers were picking those dotclock limits from the BIOS itself, at least in the legacy combios part that we had to patch. Could it be that the hd2600 BIOS template he's using to make the AVGA3000 work is different from the one you used?

That shouldn't be the case, as I copied it directly from the Groovy ISO. The file in question should be hd2600.bin, correct?

If so, my hd2600.bin is exactly the same size as the hd2600.bin found on the ISO. The only difference is the permissions. Mine is:

Code: [Select]
-rw-r--r--
While the ISO's is:

Code: [Select]
-rwxr-xr-x
Am I wrong in assuming that this shouldn't make a difference? I can chmod and test when I get home.
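Since two files of identical size can still differ in content, a byte-for-byte comparison is more conclusive than checking ls output. A sketch; both paths are made-up examples, not the actual install locations:

```shell
#!/bin/sh
# Identical size does not imply identical content: compare the two BIOS
# images byte-for-byte. Both paths are illustrative examples.
mine=/lib/firmware/hd2600.bin
iso=/mnt/iso/lib/firmware/hd2600.bin
if cmp -s "$mine" "$iso"; then
    echo "identical"
else
    echo "files differ"
fi
```

Permissions shouldn't matter for a file that's only ever read, so if cmp reports the images identical, the difference likely lies elsewhere.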
Title: Re: GroovyMAME selecting modelines that my system doesn't like?
Post by: Calamity on July 07, 2011, 09:07:42 am
Am I wrong in assuming that this shouldn't make a difference? I can chmod and test when I get home.

That shouldn't make a difference, but I'm probably not the best one to answer that; I'm just throwing out some ideas to test.
Title: Re: GroovyMAME selecting modelines that my system doesn't like?
Post by: bitbytebit on July 08, 2011, 04:29:53 pm
It seems like something is different; if it's not the kernel config, it's probably somewhere in how OpenGL and X Windows are set up, or the versions they are using. I'm not sure, though. It might help to capture the X Windows logs during the crashes, and dmesg output with the DRM layer at a higher debug level.
Title: Re: GroovyMAME selecting modelines that my system doesn't like?
Post by: gabe on July 11, 2011, 04:08:09 pm
An update... simply to show that I haven't given up ;)

Arch Linux uses a rolling release cycle, so all of the packages are generally bleeding-edge. Going backwards can be rather difficult, particularly if you are trying to downgrade to a version of a package that was never installed on your system.

In any event, I started comparing versions from Arch to Groovy, and found that Xorg, DRM, and Mesa are all newer on my Arch install than on the Groovy CD. I was unable to determine the version of ATI drivers used on the Groovy CD, but I suspect my Arch install is also using a newer version.

I'm finally going to start the dreaded (and painstaking) process of manually compiling (and recompiling) suspected packages and downgrading where necessary. I hope to pinpoint exactly what is breaking my system... Not only for myself, but also because I'd like to contribute something mildly useful to this fine project  :cheers:
Title: Re: GroovyMAME selecting modelines that my system doesn't like?
Post by: lettuce on July 20, 2011, 02:25:45 pm
Thanks for the info, ill look into this  :cheers:
Title: Re: GroovyMAME selecting modelines that my system doesn't like?
Post by: Ansa89 on January 26, 2012, 07:57:26 am
An update... simply to show that I haven't given up ;) [...]

A bit later...
Try this patch (http://forum.arcadecontrols.com/index.php?topic=107620.msg1245155#msg1245155).
I don't know the naming scheme for Arch packages, but it should be applied to something like "xf86-video-ati-VERSION_NUMBER".
Obviously you need an ATI graphics card which uses "ati" as its Xorg driver.

Hope you will solve your problem :) .