
Author Topic: Switchres: modeline generator engine  (Read 81300 times)


ahofle

  • Trade Count: (+1)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 4524
    • Arcade Ambience Project
Thanks, I'll give it a shot!

bitbytebit

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 896
    • The Groovy Organization
Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #41 on: October 11, 2010, 09:31:13 pm »
Version 0.8 has improved things considerably: the actual modelines are now really read from xorg.conf, and the lrmc code is only used when no modeline is listed for the "HxWxR" label in the Modes section (just like all the X drivers do, except it uses the lrmc method rather than VESA, unless you don't have Option "ArcadeMonitor" set to a valid monitor name). Bug fixes and other issues are hopefully sorted out too; it should be easier to use and get working now.
SwitchRes / GroovyMame - http://arcade.groovy.org
Modeline Generator and Mame Wrapper for Windows or Linux
LiveCD of Groovy Arcade Linux for Arcade Monitors
GroovyMame - generate arcade resolutions like advancemame
--
The Groovy Organization

Calamity

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 5469
Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #42 on: October 12, 2010, 05:21:30 am »
Version 0.8 has improved things considerably: the actual modelines are now really read from xorg.conf, and the lrmc code is only used when no modeline is listed for the "HxWxR" label in the Modes section (just like all the X drivers do, except it uses the lrmc method rather than VESA, unless you don't have Option "ArcadeMonitor" set to a valid monitor name). Bug fixes and other issues are hopefully sorted out too; it should be easier to use and get working now.

I'd like to summarize and see if I've understood right:

- We start running genres with proper params to get modelines and .inis done (at this point, I'm not sure if lrmc will use options from xorg.conf or from lrmc.conf). Modelines will be added to xorg.conf, and available when we restart.
- Now, we'll have ini files for Mame games with the resolution in the format WxH@R where R is a double (not integer anymore). When starting Mame, it'll push this resolution to the driver.
- The driver will parse the text labels in the Modes section of xorg.conf until it finds one that matches the "WxHxR" values.
- Then, the driver will use this label to get the proper modeline from the Monitor section of xorg.conf, and use this modeline to program Radeon hardware  :applaud:
- If the driver didn't find the mode we're asking for in the modeline table, it will produce a brand new modeline on the fly by calling lrmc.
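If I've read this right, the pieces fit together in xorg.conf roughly like this (the monitor name, sync ranges and timing numbers below are made-up examples for illustration, not real Hantarex values):

```
Section "Monitor"
    Identifier  "ArcadeCRT"
    HorizSync   15.0-16.0
    VertRefresh 49.0-65.0
    # 256x224 @ 59.64 Hz, 15.625 kHz -- example numbers only
    Modeline "256x224x59.64" 5.00 256 264 288 320 224 236 239 262
EndSection

Section "Screen"
    ...
    SubSection "Display"
        # labels here are matched against the WxHxR the driver is asked for
        Modes "256x224x59.64" "640x480x60.00"
    EndSubSection
EndSection
```

Sanity check on the example numbers: 5.00 MHz / 320 htotal = 15.625 kHz, and 15625 / 262 vtotal = 59.64 Hz, so the modeline is internally consistent.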

This is exactly what we needed, even more.

When I have the time, I'd like to have a look at lrmc and see how things are done. I've been playing with the Win32 lrmc and noticed some strange behaviour with certain resolutions. I'd like to build a Win32 binary of your version, to be able to perform some tests. Eventually, I might add some of my stuff to lrmc, though I lack the C knowledge.

Thanks again for the good work!



CRT Emudriver, VMMaker & Arcade OSD downloads, documentation and discussion:  Eiusdemmodi

ves

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 204
Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #43 on: October 12, 2010, 05:33:40 am »
Hi, I'll go in parts:
1. The ATI driver is not for the kernel (so no radeon.ko is created) but for the Xorg server (radeon.so is created), right?
2. The driver must be installed in /usr/lib/xorg/modules/drivers/, right?
3. Once the driver, lrmc and mame are installed, what should we do?
 1. Create lrmc.conf in /etc (is this where it's configured?)
 2. Configure mame.
 3. Generate the modelines and inis.
 4. Create the xorg.conf (everything is created here now, right?)

What else would we have to do or configure? Which tests would you like us to try?

Because when running genres -cga -ff -ini, xorg.conf is not configured for a CGA monitor; it's always set to d9800.



Greetings.
« Last Edit: October 12, 2010, 05:43:11 am by ves »

Calamity

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 5469
Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #44 on: October 12, 2010, 06:14:21 am »
What else would we have to do or configure? Which tests would you like us to try?

Hi VeS, it will work; the tests I meant are secondary (once we have it running, we'll think about them).

Because when running genres -cga -ff -ini, xorg.conf is not configured for a CGA monitor; it's always set to d9800.

First, we need to run lrmc --default to produce our lrmc.conf file.
Then edit lrmc.conf with the params of the Hantarex MTC9110 (the ones I posted before).
Then run genres -ff -ini (no monitor params). This will produce the .inis plus the modelines (added to xorg.conf).
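As a command sequence, the above would look something like this (the flags are the ones from this thread; the lrmc.conf values are whatever matches your monitor):

```
# 1. Generate a default config to edit
lrmc --default            # writes lrmc.conf

# 2. Edit lrmc.conf with the Hantarex MTC9110 params
#    (horizontal/vertical frequency ranges, etc.)

# 3. Generate .ini files + modelines, added to xorg.conf
genres -ff -ini

# 4. Restart X so the new modelines are available
```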



bitbytebit

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 896
    • The Groovy Organization
Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #45 on: October 12, 2010, 09:36:11 am »

Version 0.8 has improved things considerably: the actual modelines are now really read from xorg.conf, and the lrmc code is only used when no modeline is listed for the "HxWxR" label in the Modes section (just like all the X drivers do, except it uses the lrmc method rather than VESA, unless you don't have Option "ArcadeMonitor" set to a valid monitor name). Bug fixes and other issues are hopefully sorted out too; it should be easier to use and get working now.

I'd like to summarize and see if I've understood right:

- We start running genres with proper params to get modelines and .inis done (at this point, I'm not sure if lrmc will use options from xorg.conf or from lrmc.conf). Modelines will be added to xorg.conf, and available when we restart.


lrmc will use the options from /etc/lrmc.conf, or from lrmc.conf in the working directory; I've left that the same. The X server now just reads the modelines and Hsync/Vsync ranges from xorg.conf (which really just serve to limit bad modelines and keep the default modelines from being used). There are default modelines that the X server part likes to force on us; that's not so bad, since the Hsync/Vsync settings eliminate any that are out of range, but it's a little annoying that it does that. It would be nice to figure out how to have only our modelines available.

- Now, we'll have ini files for Mame games with the resolution in the format WxH@R where R is a double (not integer anymore). When starting Mame, it'll push this resolution to the driver.


In Windows it'd be WxH@R, but in Linux it uses WxHxD@R, and R is a double in my local tree here, although I haven't put that patch in yet. It's below; it's actually very simple on the Linux side, but of course the Windows one looks limited by Windows (unless using SDL there, I guess). Now that I've got the X driver behaving properly, I also want to investigate whether it really is only SDL 1.3 that listens to the refresh rate, or whether I can change things so that both SDL 1.2 and the other methods all use it. I might want to figure out ways to have MAME change the resolution with xrandr, since that seems to be the new and upcoming way of doing mode changes, and possibly why they removed modeline support (xrandr can add modelines itself, and can call ours or the ones it added; the issue is that modelines it adds aren't callable by SDL for some odd reason).

Linux patch:

Code: [Select]
diff -ru ../Mame_vanilla_0139u3_local/src/osd/sdl/video.c ./src/osd/sdl/video.c
--- ../Mame_vanilla_0139u3_local/src/osd/sdl/video.c    2010-10-08 14:10:30.000000000 -0500
+++ ./src/osd/sdl/video.c       2010-10-08 17:32:14.000000000 -0500
@@ -898,7 +898,8 @@
        const char *defdata = options_get_string(mame_options(), SDLOPTION_RESOLUTION(""));
        const char *data = options_get_string(mame_options(), name);

-       config->width = config->height = config->depth = config->refresh = 0;
+       config->width = config->height = config->depth = 0;
+       config->refresh = 0.0;
        if (strcmp(data, SDLOPTVAL_AUTO) == 0)
        {
                if (strcmp(defdata, SDLOPTVAL_AUTO) == 0)
@@ -909,7 +910,7 @@
                data = defdata;
        }

-       if (sscanf(data, "%dx%dx%d@%d", &config->width, &config->height, &config->depth, &config->refresh) < 2 && report_error)
+       if (sscanf(data, "%dx%dx%d@%lf", &config->width, &config->height, &config->depth, &config->refresh) < 2 && report_error)
                mame_printf_error("Illegal resolution value for %s = %s\n", name, data);
 }

diff -ru ../Mame_vanilla_0139u3_local/src/osd/sdl/video.h ./src/osd/sdl/video.h
--- ../Mame_vanilla_0139u3_local/src/osd/sdl/video.h    2010-02-14 12:59:50.000000000 -0600
+++ ./src/osd/sdl/video.h       2010-10-08 17:29:09.000000000 -0500
@@ -82,7 +82,7 @@
        int                                     width;                                          // decoded width
        int                                     height;                                         // decoded height
        int                                     depth;                                          // decoded depth
-       int                                     refresh;                                        // decoded refresh
+       double                                  refresh;                                        // decoded refresh

        int                                     totalColors;             // total colors from machine
 };
diff -ru ../Mame_vanilla_0139u3_local/src/osd/sdl/window.h ./src/osd/sdl/window.h
--- ../Mame_vanilla_0139u3_local/src/osd/sdl/window.h   2010-10-08 14:10:30.000000000 -0500
+++ ./src/osd/sdl/window.h      2010-10-08 17:26:46.000000000 -0500
@@ -63,7 +63,7 @@
        int                                     minwidth, minheight;
        int                                     maxwidth, maxheight;
        int                                     depth;
-       int                                     refresh;
+       double                                  refresh;
        int                                     windowed_width;
        int                                     windowed_height;
        int                                     startmaximized;

Windows patch:
Code: [Select]
diff -ru ../Mame_vanilla_0139u3_local/src/osd/windows/video.c ./src/osd/windows/video.c
--- ../Mame_vanilla_0139u3_local/src/osd/windows/video.c        2010-10-08 14:09:59.000000000 -0500
+++ ./src/osd/windows/video.c   2010-10-08 17:33:05.000000000 -0500
@@ -553,13 +553,14 @@
        const char *defdata = options_get_string(mame_options(), WINOPTION_RESOLUTION);
        const char *data = options_get_string(mame_options(), name);

-       config->width = config->height = config->refresh = 0;
+       config->width = config->height = 0;
+       config->refresh = 0.0;
        if (strcmp(data, "auto") == 0)
        {
                if (strcmp(defdata, "auto") == 0)
                        return;
                data = defdata;
        }
-       if (sscanf(data, "%dx%d@%d", &config->width, &config->height, &config->refresh) < 2 && report_error)
+       if (sscanf(data, "%dx%d@%lf", &config->width, &config->height, &config->refresh) < 2 && report_error)
                mame_printf_error("Illegal resolution value for %s = %s\n", name, data);
 }
diff -ru ../Mame_vanilla_0139u3_local/src/osd/windows/video.h ./src/osd/windows/video.h
--- ../Mame_vanilla_0139u3_local/src/osd/windows/video.h        2009-10-12 01:45:26.000000000 -0500
+++ ./src/osd/windows/video.h   2010-10-08 17:34:11.000000000 -0500
@@ -78,7 +78,7 @@
        float                           aspect;                                         // decoded aspect ratio
        int                                     width;                                          // decoded width
        int                                     height;                                         // decoded height
-       int                                     refresh;                                        // decoded refresh
+       double                                  refresh;                                        // decoded refresh
 };


diff -ru ../Mame_vanilla_0139u3_local/src/osd/windows/window.h ./src/osd/windows/window.h
--- ../Mame_vanilla_0139u3_local/src/osd/windows/window.h       2010-10-08 14:09:59.000000000 -0500
+++ ./src/osd/windows/window.h  2010-10-08 17:33:51.000000000 -0500
@@ -92,7 +92,7 @@
        int                                     fullscreen;
        int                                     fullscreen_safe;
        int                                     maxwidth, maxheight;
-       int                                     refresh;
+       double                                  refresh;
        float                           aspect;

        // rendering info


- The driver will parse the text labels in the Modes section of xorg.conf until it finds one that matches the "WxHxR" values.


In theory yes, but it remains to be proven whether it always makes the right decision, which depends on the patch above and possibly on figuring out SDL 1.3. SDL 1.3 has been in development for years, and it doesn't look like they plan on ever really releasing something stable; I hope it happens soon, because I think it would solve all these issues. I need to test it more to see whether it is usable or not.

- Then, the driver will use this label to get the proper modeline from the Monitor section of xorg.conf, and use this modeline to program Radeon hardware  :applaud:
- If the driver didn't find the mode we're asking for in the modeline table, it will produce a brand new modeline on the fly by calling lrmc.


Yep, that's the nice thing about how lrmc fits in: it basically does the same thing they do, following the use-a-modeline-calculator-in-X method. They actually have CVT and UMG available in the X server for drivers to use; now we have lrmc support in at least the radeon driver. So in theory you could totally trust lrmc and just write a line of Mode labels in the "HxWxR" format, and it'll run those using the method you choose with the "ArcadeMonitor" Option: either /etc/lrmc.conf if it is set to "lrmc", or one of the monitor types supported by lrmc, like cga, ega, d9800, etc.
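So, if I follow, the lrmc fallback is selected from the Device section, something like this (the section layout is my guess from the description; only the Option name and its values come from the posts above):

```
Section "Device"
    Identifier "Radeon"
    Driver     "radeon"
    # "lrmc" -> read params from /etc/lrmc.conf;
    # or a preset monitor type: cga, ega, d9800, ...
    Option     "ArcadeMonitor" "lrmc"
EndSection
```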

This is exactly what we needed, even more.

When I have the time, I'd like to have a look at lrmc and see how things are done. I've been playing with the Win32 lrmc and noticed some strange behaviour with certain resolutions. I'd like to build a Win32 binary of your version, to be able to perform some tests. Eventually, I might add some of my stuff to lrmc, though I lack the C knowledge.



That would be great, because I know lrmc isn't always making the best decisions, and I can tell from your program that you could probably really shape it up and get much nicer modelines generated. With that, an improved lrmc would be very useful both to Windows Soft15Khz users, for making .ini files and modelines there, and to Linux users with the Radeon driver. I'm just glad that the Radeon driver will now play ball with us and use our modelines, so we can move on to the actual modeline generation and vsync issues, which is the more exciting part, of course. That Radeon driver was quite crazy to figure out; looking back at what I had to do, it was not terribly complicated, but it was totally undocumented, and I see no other X driver doing it. The modelines from the config file are all there and available to the driver; it just never took the time to fetch them, compare them to the Modes section, and pick the ones that matched.

Thanks again for the good work!


Thanks. Now I'm planning to look through your modeline generator and also the MAME sources, to figure out more about how to make better modelines, make sure MAME can pick the right modeline, and see how vsync is done with SDL.

Your help would be great. I understand about the C knowledge, but if you had time, it would be really helpful if you could write out the general procedure you follow to take a game's resolution from the MAME XML file and turn it into the final .ini resolution and modeline. Just basic outline-type stuff, because I am reading your code, but it's definitely hard to fully grasp quickly. I can put things into C, and since you know the algorithms so well, it will definitely be nice to combine that knowledge. We'll see how quickly I can grasp what you're doing from your program; I think I can, but it may take a week of looking each day to start catching on. Looking at lrmc is something I need to do too, because it's rather complicated from what I can tell; seeing the two compared together may help me understand how these modelines are computed, to come up with the best final modeline.




bitbytebit

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 896
    • The Groovy Organization
Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #46 on: October 12, 2010, 01:17:59 pm »
I've confirmed that MAME with SDL 1.2 does not always set the right refresh rate; it just picks the first modeline it sees. This has two issues: first, it's the wrong modeline; second, since the X driver throws in a set of default resolutions with basic "WxH" labels, those always override ours when one is available for that size. Possibly SDL 1.3 doesn't have this issue, but of course we aren't there yet. So I like your idea of incrementing the width by one for each of our custom modelines, to let MAME call the correct ones via the .ini files that reference them. I don't know if it's possible to remove those default resolutions from X, because that happens in the main xorg-server, not in the radeon driver itself. Even if I get it to stop, it's not a change they'd ever accept in the official driver (I'm not sure they'd accept the arcade-monitor changes I'm doing either; maybe just lrmc integration and an arcade setting, but otherwise the philosophy of Xorg seems to be not to let users touch modelines). And we'd still need your fix for referencing modelines properly.

On a good note, I have been able to get doublescan and interlacing support working, since the newer radeon driver supports them, and now I can display 192-line vertical resolution games perfectly with doublescan, like the Transformers one. There are a few changes I made to lrmc last night, after the last 8a version of genres, which allow this to work by checking whether it should do doublescan at all; enabling it by default lets it doublescan games it probably shouldn't. I think lrmc is a bit untamed: it sometimes goes overboard with things like interlacing, doublescan, or making the WxH too large, so I've been adding rules to fix up the WxHxR and the settings it would normally choose. Hopefully it can eventually be made to calculate things properly and in a more controlled way the first time, without needing second passes on the output.

Calamity

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 5469
Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #47 on: October 12, 2010, 03:43:34 pm »
Hi bitbytebit,

We also have that problem with default video modes appearing on us in Windows. Fortunately, custom modelines override default modes when they match. But as room for video modes is very limited (60 by default, 200 with my hacked drivers) and these modes waste space, we have to "restrict" them; there's a registry key to do this.

It would be great if you got SDL to properly deal with vertical refresh. In Windows we have this "integer bottleneck" in the DirectX functions (who would ever want to deal with double-precision vfreq values?), which led me to use that system for differencing modelines. However, this system has a drawback: as the variety of video modes reported by the Mame xml increases with each new version, you can eventually run out of "x-labels", as the fake increased xres reaches the next 8-multiple. Thus, it becomes necessary to control the total number of produced modelines (which, by the way, I must do in Windows anyway to keep the number below the limit). Writing an algorithm that does a good job of reducing a resolution list while keeping the most important entries is not a trivial matter; I'd say it's the most difficult part of the game. I believe lrmc has an implementation of this. There's something more to take care of when using this system: Mame tries to center the game frame on the screen, so if you use a fake xres increased by more than one pixel, you will start losing pixels on the frame's right side, as you'll have a hardware vs. logical resolution mismatch (Mame would work with a logical frame bigger than the screen resolution, so a patch would be needed for this). If the SDL thing can be figured out, none of this would be necessary.

It's good news you have doublescan working! I haven't been able to turn it on for custom modelines in Windows (though the card is capable and it works for default modes). This is due to buggy Catalyst drivers.

From the tests VeS and I have done, lrmc seems a little twisted when dealing with low resolutions. We couldn't force it to produce an exact 256x224 resolution for Toki, using lrmc.conf to store our Hantarex settings. Maybe we're missing something?

I've devoted a lot of time to thinking about the best way of selecting resolutions for games. I don't believe there's a "perfect way". The selection is monitor dependent. In many cases you just have to go for the less bad option, and even then it depends on taste. Now I regret that all the development I've done is focused on lowres arcade monitors, but I think it won't be difficult to extend the logic to multisync monitors, and this is a good chance to do it. When I have the time I'll sketch out the basic idea; it's not so complicated after all.


bitbytebit

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 896
    • The Groovy Organization
Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #48 on: October 12, 2010, 04:03:33 pm »
Hi bitbytebit,

We also have that problem with default video modes appearing on us in Windows. Fortunately, custom modelines override default modes when they match. But as room for video modes is very limited (60 by default, 200 with my hacked drivers) and these modes waste space, we have to "restrict" them; there's a registry key to do this.

It would be great if you got SDL to properly deal with vertical refresh. In Windows we have this "integer bottleneck" in the DirectX functions (who would ever want to deal with double-precision vfreq values?), which led me to use that system for differencing modelines. However, this system has a drawback: as the variety of video modes reported by the Mame xml increases with each new version, you can eventually run out of "x-labels", as the fake increased xres reaches the next 8-multiple. Thus, it becomes necessary to control the total number of produced modelines (which, by the way, I must do in Windows anyway to keep the number below the limit). Writing an algorithm that does a good job of reducing a resolution list while keeping the most important entries is not a trivial matter; I'd say it's the most difficult part of the game. I believe lrmc has an implementation of this. There's something more to take care of when using this system: Mame tries to center the game frame on the screen, so if you use a fake xres increased by more than one pixel, you will start losing pixels on the frame's right side, as you'll have a hardware vs. logical resolution mismatch (Mame would work with a logical frame bigger than the screen resolution, so a patch would be needed for this). If the SDL thing can be figured out, none of this would be necessary.

It's good news you have doublescan working! I haven't been able to turn it on for custom modelines in Windows (though the card is capable and it works for default modes). This is due to buggy Catalyst drivers.

From the tests VeS and I have done, lrmc seems a little twisted when dealing with low resolutions. We couldn't force it to produce an exact 256x224 resolution for Toki, using lrmc.conf to store our Hantarex settings. Maybe we're missing something?

I've devoted a lot of time to thinking about the best way of selecting resolutions for games. I don't believe there's a "perfect way". The selection is monitor dependent. In many cases you just have to go for the less bad option, and even then it depends on taste. Now I regret that all the development I've done is focused on lowres arcade monitors, but I think it won't be difficult to extend the logic to multisync monitors, and this is a good chance to do it. When I have the time I'll sketch out the basic idea; it's not so complicated after all.



I'm thinking about hacking the actual X server itself, because then I could stop it from shoving in extra modelines, and it might be better to keep it and the radeon driver at the same level. It's an idea, at least; I'm still not certain how practical it is to maintain an alternate version of it. Patching just the radeon driver makes sense, but patching the whole X server suggests something is really wrong. Interestingly, though, doing that would let me fix the modeline issue for all cards, because that is where we could insert them for anybody. That is where they seem to have removed the functionality, I'm guessing, and then told the drivers to create modes with the VESA cvt program. Also, I have found that it definitely isn't labels but actual sizes that the SDL layer commands the X server to change to, so I can't just change the label. I have just written code to test this, changing the horizontal resolution by one pixel per refresh rate. So far it seems there are really no more than 4-5 duplicate resolutions, but it is concerning that it has to be done this way with SDL 1.2. From what I can tell, only SDL 1.3 fixes the refresh issue; I think it also does the vsync stuff. I'm possibly looking at that next to see if it works at all. I think I had SDL 1.3 running before, but I may not have been using switchres, and that might be where it is broken. That, of course, is the only reason we would want to use it, so a broken switchres would negate any positives about it.

LRMC is odd: unless you specify the -n option for no stretch, it will add tons to the low resolutions; then make sure you have -l set too, for a 5 MHz pixel clock. Oddly, when you set things via the lrmc.conf file, you don't get the full changes for a CGA monitor that you would with the predefined -cga option, since it only changes the values in the config file and not the other key values for the modeline. If you can get at least one good modeline and put it into lrmc.conf, it does better at guessing the front porch and other values like that. I definitely think lrmc could use a lot of little fixups in how it calculates things. Doublescan works well with the newest ATI radeon driver, and was broken with the older version I had (it was a month or two old; judging by the changelog, they fixed it in the last couple of weeks). I basically have to keep lrmc under control and only try doublescan if the vertical resolution is <= 192; otherwise it would doublescan everything it can that wants a large resolution. It will willingly go out of range of the proper Vsync values when using doublescan; it can double what you put into the config, or the preset values. At least that's what I can tell; I might be wrong, but xrandr says they are big. Also, I have tried using xrandr, and it is just very unstable and terrible at resolution switching; it won't even work with this ATI radeon driver, possibly because the driver is newer than the X server I'm using. So that's another reason I want to look at the X server and see about hacking it to fix the default resolutions. Overriding them is an option, but it gets tricky, because then you've got a resolution that isn't incremented by 1 and has to be exact, while right now I'm incrementing them by one by default to avoid the default resolutions.
I need to figure out how to avoid that. I have some ideas; probably just make the first label not include the refresh rate, which will override the default ones. Although those default ones are annoying, because it'll sometimes put in three of one resolution, all with a basic label like "640x480", and I'm not sure adding one more to that mess will really override all three. So hacking the X server would let me trust the resolutions to have the first one at the right horizontal size, with additional ones incrementing from there.

bitbytebit

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 896
    • The Groovy Organization
Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #49 on: October 12, 2010, 04:51:52 pm »
Hi bitbytebit,

We also have that problem with default video modes appearing on us in Windows. Fortunately custom modelines override default modes when they match. But as room for video modes is very limited (60 by default, 200 with my hacked drivers) and these modes waste space we have to "restrict" them, there's a registry key to do this.

It would be great you got SDL to properly deal with vertical refresh. In Windows we have this "integer bottleneck" in the DirectX functions (who would ever want to deal with double precission vfreq values?), that lead me to use that system for differencing modelines. However, this system has a drawback: as variety of video modes reported by Mame xml increases with each new version, you can eventually run out of "x-labels", as fake increased xres reaches the next 8-multiple. Thus, it becomes a need to control the total number of produced modelines - which by the way I must do in Windows to keep the number below the limit - Writing an algorithm that does a good job reducing a resolution list while keeping the most important ones, is not a trivial matter, I'd say it's the most difficult part of the game. I believe lrmc has an implementation of this. There's something more to care about when using this system: Mame tries to center the game frame on the screen, so if you use a fake xres increased more than one pixel you will start loosing pixels on frame's right side, as you'll have a hardware vs logic resolution issue (Mame would work with a logical frame bigger than the screen resolution, so a patch on this would be needed). If SDL thing can be figured out, all this would not be necessary.

It's good news you have doublescan working! I haven't been able to turn it on for custom modelines in Windows (though the card is capable and it works for default modes). This is due to buggy Catalyst drivers.

From the tests VeS and I have done, lrmc seems a little twisted when dealing with low resolutions. We couldn't force it to produce an exact 256x224 resolution for Toki, using lrmc.conf to store our Hantarex settings. Maybe we're missing something?

I've devoted a lot of time to thinking about the best way of selecting resolutions for games. I don't believe there's a "perfect way": the selection is monitor dependent. In many cases you just have to go for the least bad option, and even then it depends on taste. I regret that all the development I've done is focused on lowres arcade monitors, but I don't think it will be difficult to extend the logic to multisync monitors, and this is a good chance to do it. When I have the time I'll sketch the basic idea; it's not so complicated after all.



I'm thinking about hacking at the actual xserver itself, because then I can just stop it from shoving in extra modelines, and it might also be better to keep it and the radeon driver at the same level.  It's an idea at least; I'm still not certain how practical it is to maintain an alternate version of it.  Just patching the radeon driver makes sense, but needing to patch the whole xserver says something is really wrong.  Interestingly, though, doing that would let me fix the modeline issue for all cards, because that is where we could insert them for everybody.  That is where they seem to have removed the functionality, I'm guessing, and then told the drivers to all create modes with the Vesa cvt program.  Also I have found that it definitely isn't labels but actual sizes that the SDL layer is commanding the X server to change to, so I can't just change the label.  I have just written code to test this, changing the horizontal resolution by 1 pixel per refresh rate.  So far it seems there are really no more than 4-5 duplicate resolutions, but it is concerning that it has to be done this way with SDL 1.2.  From what I can tell, only SDL 1.3 fixes the refresh issue; I think it also does the vsync stuff.  I'm possibly looking at that next to see if it works at all - I think I had SDL 1.3 running before, but I may not have been using switchres, and that might actually be where it is broken.  That, of course, is the only reason we would want to use it, so a breakage there would negate any positives about it.

LRMC is odd: unless you specify the -n option for no stretching, it will add tons of stretch to the low resolutions, and then you have to make sure -l is set too for a 5MHz pixel clock.  Oddly, when you set things with the lrmc.conf file you don't get the full changes for a CGA monitor that you would with the predefined -cga option, since it only changes the values present in the config file and not the other key values for the modeline.  If you can get a few good modelines, or at least one, and put them into lrmc.conf, it does better at guessing the front porch and other values like that.  I definitely think lrmc could use a lot of little fixups in how it calculates things.  Doublescan works well with the newest ati radeon driver, and was broken with the older version I had (it was a month or two old; according to the changelog they fixed it in the last couple of weeks).  I basically have to keep lrmc under control and only try doublescan if the resolution is <= 192, else it would do it for everything that wants a large resolution.  It will willingly go out of range of the proper Vsync values when using doublescan - it can double what you put into the config or the preset values.  At least that's what I can tell; I might be wrong, but the xrandr program says they are big.  Also I have tried using xrandr and it is just very unstable and terrible at resolution switching; it won't even work with this ati radeon driver, possibly because it's newer than the xserver I'm using.  So that's another reason I want to look at the xserver and see about hacking it to fix the default resolutions.  Overriding them is an option, but it gets tricky because then you've got a resolution that isn't incremented by 1 and the match has to be exact, while right now I'm incrementing them by one by default to avoid the default resolutions.
I need to figure out how to avoid that. I have some ideas - probably just make the first label not include the refresh rate, which will override the default ones.  Although those default ones are annoying, because X will sometimes put as many as three entries of one resolution, all with the basic label like "640x480", and I'm not sure one more added to that mess will really override all three.  So hacking the xserver would allow me to trust the resolutions to have the first one at the right horizontal size and additional ones incrementing from there.

Just did some tests and got my incrementing code so it only increments when needed. I also had to make it back off and decrease by 1 for duplicates when they were already at the maximum horizontal size like 640x480.  Would decreasing be an option that at least wouldn't go off screen, and instead just slightly reduce the picture by a pixel or two?  Basically I am now just making my modeline labels the "HxW" format to try and beat the default ones X inserts (with my Hsync/Vsync values low enough there are really only two: a 640x480 and a 320x240, which work, but the 320x240 one has too high a Vrefresh and is also off center).  From what I can tell I can't really override the default ones this way; X will just force them in anyway, and then it's up to the sorting process which one wins out and is on top. The duplicating method beats them, it seems, because the duplicates are always one pixel more and X seems to sort by H values.  I guess that's a drawback of the decreasing method, unless we can really knock those defaults out of there.

Also I've found that lrmc is really no good at anything besides EGA calculations: the VGA ones are off and the SVGA ones are way off.  The default resolution X uses seems to have been what was overriding my default screen normally, because when I incremented the modes everything got warped, and it turns out the lrmc VGA resolutions are just bad.  It also shows how aggressively X replaces and prioritizes its own resolutions over yours - at least with that one it did, because it has always been used even when I thought it couldn't be.  This I guess is yet another sign that I might need to change the xserver itself, and I know exactly where the code to do it is.  Calling that function from the Radeon driver itself and redoing it might work, but I'm not sure all the code from that function can be ported into the Radeon driver and still work.  It's the basic X xf86InitialConfiguration() function, where X takes the modelines we added and the default ones it chose for us and combines them all together.
SwitchRes / GroovyMame - http://arcade.groovy.org
Modeline Generator and Mame Wrapper for Windows or Linux
LiveCD of Groovy Arcade Linux for Arcade Monitors
GroovyMame - generate arcade resolutions like advancemame
--
The Groovy Organization

bitbytebit

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 896
    • The Groovy Organization
Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #50 on: October 13, 2010, 04:00:51 am »
Version .9 is up, now on sourceforge or my website, since it got larger with the optional xserver source that has the default modeline setup removed, to avoid useless modes we never use on an arcade monitor.  It also has an xrandr resolution-switching wrapper script that solves the issue of having multiple resolutions with the same height and width but different refresh rates.  Lots of bug fixes; read the first post of the thread for more details.
SwitchRes / GroovyMame - http://arcade.groovy.org
Modeline Generator and Mame Wrapper for Windows or Linux
LiveCD of Groovy Arcade Linux for Arcade Monitors
GroovyMame - generate arcade resolutions like advancemame
--
The Groovy Organization

Calamity

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 5469
Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #51 on: October 13, 2010, 04:40:24 am »
Hi bitbytebit,

It's so good you figured out how to fix the built-in modelines issue and also included xrandr functionality.

It's funny that the x server messes with modelines, that's not what it's supposed to do according to this:

http://www.x.org/archive/X11R6.8.1/doc/xorg.conf.5.html

Quote
The Identifier entry specifies the unique name for this monitor. The Monitor section provides information about the specifications of the monitor, monitor-specific Options, and information about the video modes to use with the monitor. Specifying video modes is optional because the server now has a built-in list of VESA standard modes. When modes are specified explicitly in the Monitor section (with the Modes, ModeLine, or UseModes keywords), built-in modes with the same names are not included. Built-in modes with different names are, however, still implicitly included.

As for the increasing/decreasing method, I think decreasing is not an option, as you must consider xres to be measured in characters (8-pixel blocks) instead of pixels. When passing xres values to the hardware, at some point I believe xres is rounded down to its lower 8-multiple, as VGA registers work with characters (btw, that's why the horizontal part of the modeline must use 8-multiples). So if you decrease xres you might lose 8 pixels instead of one (I haven't tested this).
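A quick way to see the cost being described (a toy sketch; the rounding behavior of the real hardware is the conjecture above, not something verified here):

```shell
# If the hardware rounds xres down to whole 8-pixel characters, then
# decreasing a 640-wide mode by one pixel could effectively cost eight.
to_chars() {   # round xres down to its lower multiple of 8
  echo $(( $1 / 8 * 8 ))
}
```

Usage: `to_chars 639` yields 632, so a 1-pixel decrease would lose 8 pixels, while incrementing (`to_chars 641` yields 640) stays safe.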

« Last Edit: October 13, 2010, 04:47:00 am by Calamity »
CRT Emudriver, VMMaker & Arcade OSD downloads, documentation and discussion:  Eiusdemmodi

bitbytebit

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 896
    • The Groovy Organization
Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #52 on: October 13, 2010, 04:46:54 am »
Hi bitbytebit,

It's so good you figured out how to fix the built-in modelines issue and also included xrandr functionality.

It's funny that the x server messes with modelines, that's not what it's supposed to do according to this:

http://www.x.org/archive/X11R6.8.1/doc/xorg.conf.5.html

Quote
The Identifier entry specifies the unique name for this monitor. The Monitor section provides information about the specifications of the monitor, monitor-specific Options, and information about the video modes to use with the monitor. Specifying video modes is optional because the server now has a built-in list of VESA standard modes. When modes are specified explicitly in the Monitor section (with the Modes, ModeLine, or UseModes keywords), built-in modes with the same names are not included. Built-in modes with different names are, however, still implicitly included.

As for the increasing/decreasing method, I think decreasing is not an option, as you must consider xres to be measured in characters (8-pixel blocks) instead of pixels. When passing xres values to the hardware, at some point I believe an "xres MOD 8" operation is performed, as VGA registers work with characters (btw, that's why the horizontal part of the modeline must use 8-multiples). So if you decrease xres you might lose 8 pixels instead of one (I haven't tested this).


Yeah, this xrandr stuff is just really strange: it worked pretty well, then crashed, and after a reboot it really didn't act like I expected.  So I guess there's some hope, but right now it only looks 50% better than waiting for SDL 1.3 to arrive, which claims to fix the vsync issues (although it seems if that isn't working yet then it hasn't been fixed, and that must be why it's taking so long).  The incremental stuff at least works quite well - I get the best overall look of the games from it so far, definitely my favorite right now.  I don't know if xrandr is really this unstable or I just need to update all of my X Windows to the current versions, but that seems pretty crazy, and I'm not expecting everyone to do that because it's a big task.
SwitchRes / GroovyMame - http://arcade.groovy.org
Modeline Generator and Mame Wrapper for Windows or Linux
LiveCD of Groovy Arcade Linux for Arcade Monitors
GroovyMame - generate arcade resolutions like advancemame
--
The Groovy Organization

elvis

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 1129
  • penguin poker
    • StickFreaks
Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #53 on: October 16, 2010, 08:29:00 am »
I just want to say thanks especially to Calamity and bitbytebit for all their open discussion in this thread.  It's explained a hell of a lot for me, and why I've been having so many dramas with recent Xorg builds (the forced 60.00Hz issue and Xorg ignoring my modeline refresh was driving me nuts).

I'll be playing with xrandr over the next week myself (based on the perl script in the GenRes stuff) to see if that solves my issue.  I'm using NVidia hardware and hacked nv_drv.so myself to force pclocks lower than 12MHz (which is hard set in the nv drivers). 

But yeah, thanks again to you both.  This thread has shone a lot of light on a few issues for me.

bitbytebit

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 896
    • The Groovy Organization
Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #54 on: October 16, 2010, 12:38:09 pm »
There are a lot of brick walls in Linux for arcade monitors, it seems, which I'm finding and working on breaking.  First, the Xorg servers all seem to have no options for CGA/EGA or anything below 31kHz, plus the pclock lock and the move away from modelines to CVT.  On top of that, I'm finding the kernel framebuffer has similar issues, with drivers forcing a higher minimum pclock.  Last night I changed uvesafb to allow a lower pclock and finally got mame to modeswitch to arcade resolutions using an fb.modes file.  I'm still working out issues there, but it at least finally responded, unlike the default vesa framebuffer, which is useless for an arcade monitor, or the radeon one, which seems to ignore my ArcadeVGA card (still not sure why).  Also there's the fact that we don't get an accurate vertical refresh rate until SDL 1.3, whose newest build is currently broken with mame.  I have a patch which makes it work, and it's really nice how it can truly choose the right modeline by both resolution and refresh rate.  It also has vsync support, I guess, so it all looks neat.  The only issue is that they haven't implemented the ability to hide the mouse cursor or turn it into a fully non-limited mouse.  So any game needing a mouse, trackball or spinner has the mouse movement limited by the sides of the screen, and you see a huge mouse cursor all the time.  SDL 1.3 is nice, though, for games like pacman or other simpler classics - it just avoids the hack of incrementing the horizontal frame size.  I've got a newer version of genres coming which I'm now mainly working on, besides some odds and ends for bug fixes, getting the vertical refresh rates as accurate as possible (to avoid needing as many vsync hacks).
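For reference, an fb.modes entry for a 15kHz-class mode looks something like the sketch below. The numbers are illustrative placeholders, not the ones actually used here; fbset's `timings` line takes the pixel clock in picoseconds, followed by the left/right/upper/lower margins and the horizontal/vertical sync lengths.

```
mode "320x240-60"
    # pixclock in ps: 6.4 MHz -> 1e12 / 6.4e6 = 156250
    # htotal = 320+32+16+48 = 416 -> hfreq ~ 15.4kHz
    # vtotal = 240+10+3+3  = 256 -> vfreq ~ 60Hz
    geometry 320 240 320 240 16
    timings 156250 32 16 10 3 48 3
endmode
```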
Of course this is mostly geared at Wells Gardner D9x00 monitors, which can somewhat go into odd horizontal refresh rates above 15.725kHz, so I can actually run pacman at a vertical 60.61Hz and 252x288 by going to a 19.2kHz horizontal refresh rate (and I've really noticed the games behave so much more nicely when they match the right refresh rate). With this I've also been working on learning the modeline voodoo to be able to improve lrmc, studying the modelines from Soft 15Khz to try to learn what they are doing, because those modelines are always really nice examples to go by.  I have actually found some tweaks to lrmc that seem to create modelines much more like the ones I see people using in Windows/Soft15Khz. One is the vertical-game aspect ratio/horizontal width sizing, which lrmc does differently, and I think I've figured out how to correct it.  I'm still trying to figure out how to make sure the refresh rates are good enough for people with plain arcade monitors, yet exact for people with these multisync 15-40kHz WG-type monitors.  Hopefully I'll post a new genres by the end of the weekend with those general fixes, and the SDL 1.3 patch to mame for those who want to play with it.
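Matching a game's refresh this way is just the standard modeline arithmetic - hfreq = pixel clock / htotal, vfreq = hfreq / vtotal. A tiny helper to check a candidate (the sample numbers in the usage note are made up for illustration):

```shell
# Standard modeline relations: pass the pixel clock in MHz plus the
# horizontal and vertical totals, get the resulting frequencies in Hz.
hfreq() {   # args: pclock_mhz htotal
  awk -v pc="$1" -v ht="$2" 'BEGIN { printf "%.3f", pc * 1000000 / ht }'
}
vfreq() {   # args: pclock_mhz htotal vtotal
  awk -v pc="$1" -v ht="$2" -v vt="$3" 'BEGIN { printf "%.2f", pc * 1000000 / (ht * vt) }'
}
```

For instance `vfreq 6.144 400 256` gives 60.00, so tweaking htotal/vtotal (while keeping hfreq inside the monitor's range) is how the vertical refresh gets dialed in.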
SwitchRes / GroovyMame - http://arcade.groovy.org
Modeline Generator and Mame Wrapper for Windows or Linux
LiveCD of Groovy Arcade Linux for Arcade Monitors
GroovyMame - generate arcade resolutions like advancemame
--
The Groovy Organization

Calamity

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 5469
Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #55 on: October 16, 2010, 06:04:32 pm »
Hi there,

I've been out for some days, and I'll try to catch up with the new info. I'd like to thank bitbytebit again, and the others as well.

These days I've been thinking about the vfreq-ignoring issue with SDL 1.2. There is a way to overcome it, I believe. Maybe you've already tested this and I didn't follow. We can use a resolution table for xorg.conf, without explicit vfreq. This table would be created by condensing the whole Mame video mode list, omitting vfreq values. Mame inis would just include WxH values. For each of these resolutions we would create a sample modeline (say 60Hz) to be included in xorg.conf. Mame already knows the valid vfreq values for each game, so we just need to patch Mame so it invokes lrmc for the right values before it tries to switch video mode. A new "throw-away" modeline would be created, which should be made available to the Radeon driver by any means (could it overwrite the sample one in xorg.conf?). Now, Mame would invoke video mode switching without pushing vfreq. As there is just one modeline defined for this resolution - the one we just created - the proper video mode should be invoked.

I'm not sure how this might collide with the built-in video mode list, but the general idea - a fixed resolution table with default vfreq values, tuning the existing mode to the requested refresh before switching video mode - is the approach I'd like to implement for Windows in the near future.

CRT Emudriver, VMMaker & Arcade OSD downloads, documentation and discussion:  Eiusdemmodi

Calamity

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 5469
Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #56 on: October 16, 2010, 07:10:52 pm »
I'm still trying to figure out though how to make sure the refresh rates are good enough for people with just arcade monitors, yet be able to get them exact for people with this multisync able to do 15-40Khz WG type monitors.

I've never touched a multisync monitor; I need to get one of these WG. I wonder if the 15-40kHz sync range is continuous or has some gaps (I think it does). If you have several separate intervals, I think the best approach would be to consider each one as a single monitor, with its own HfreqMin-HfreqMax valid interval, then choose the best one for the mode we want to produce - or the only one we have if it's a non-multisync monitor. Otherwise you'll end up with a lot of hardcoded conditionals for each particular case. The definitive approach is to produce our modeline for all of those virtual monitors: for some of them it will necessarily be degraded (doublescanned, interlaced-stretched, bad vfreq, bad res), depending on the valid hsync interval and the general conditions we set in the algorithm, so we'll get a score that accounts for this degradation. This score will be used to choose the best modeline and discard the others. I'd like to go more into this when I have some time. If you need any help with modeline calculation just let me know.
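The scoring idea can be sketched in a few lines (the penalty weights below are arbitrary placeholders, just to show the shape of it): generate a candidate per hsync interval, add a penalty for each degradation, keep the lowest-scoring candidate.

```shell
# Toy sketch of scoring degraded mode candidates; weights are invented.
# args: doublescan interlaced (0/1 flags), vfreq_error res_error (integers)
score() {
  echo $(( $1 * 4 + $2 * 2 + $3 + $4 ))
}
# pick the 0-based index of the lowest-scoring candidate
best() {
  i=0; best_i=0; best_s=
  for s in "$@"; do
    if [ -z "$best_s" ] || [ "$s" -lt "$best_s" ]; then best_s=$s; best_i=$i; fi
    i=$((i + 1))
  done
  echo "$best_i"
}
```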
« Last Edit: October 16, 2010, 07:14:44 pm by Calamity »
CRT Emudriver, VMMaker & Arcade OSD downloads, documentation and discussion:  Eiusdemmodi

bitbytebit

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 896
    • The Groovy Organization
Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #57 on: October 16, 2010, 10:27:55 pm »
I'm still trying to figure out though how to make sure the refresh rates are good enough for people with just arcade monitors, yet be able to get them exact for people with this multisync able to do 15-40Khz WG type monitors.

I've never touched a multisync monitor; I need to get one of these WG. I wonder if the 15-40kHz sync range is continuous or has some gaps (I think it does). If you have several separate intervals, I think the best approach would be to consider each one as a single monitor, with its own HfreqMin-HfreqMax valid interval, then choose the best one for the mode we want to produce - or the only one we have if it's a non-multisync monitor. Otherwise you'll end up with a lot of hardcoded conditionals for each particular case. The definitive approach is to produce our modeline for all of those virtual monitors: for some of them it will necessarily be degraded (doublescanned, interlaced-stretched, bad vfreq, bad res), depending on the valid hsync interval and the general conditions we set in the algorithm, so we'll get a score that accounts for this degradation. This score will be used to choose the best modeline and discard the others. I'd like to go more into this when I have some time. If you need any help with modeline calculation just let me know.


I've asked the Wells Gardner support guy about the range, and whether it's continuous, or if not, what the tolerance levels are.  The lrmc guy seems to have gone through this with the 9200, but it was different; the 9400/9800 seem to have both extended the range to include SVGA and become more robust about handling that entire range.  lrmc is built to do this: basically you can define however many ranges of details, and it scores each one and usually seems to pick the right one.  So this is a good thing about lrmc; the trick is that you can't really define them in the config like the hardcoded structures in the linked-list setup used for the -cga/ega/d9200 etc. predefines.  I actually think the way lrmc is set up should in theory work pretty well for basic CGA monitors where you know the values, as long as a user can input the complete setup in the config or on the command line.  There are details in the calculations for the 9200/9800 I'm trying to work out, and the Wells Gardner guy hopefully will help with that, knowing the true limits.  Although I can make it do anything in that range and it seems to work fine, I just want to make sure they approve.  I think the main issues with lrmc's calculations are the centering, or back/front porch calculations, and that may just be when it comes to these Wells Gardner-type monitors, since they are very tricky to pin down - they are all types of monitors in one.

The X windows thing is interesting - having mame create the modelines itself.  The tricky part, though, at least from what I've seen, is that the machinery that dynamically puts modelines into X isn't stable: it's kind of slow and often crashes X.  I've seen it using xrandr, and I've also read threads of people saying that xrandr will often either not work or crash X randomly on them.   The other issue is that window centering and the general management of SDL onto the X display don't work as well outside of letting SDL do it.  I can use xrandr, but then I really have to work to get the games to position themselves inside the visible screen area, since X likes to keep a virtual desktop when changing modes to smaller display sizes.  I am guessing SDL usually takes care of all this, and without it doing that part things just seem much less stable and smooth.  SDL 1.3 really looks exciting because of that, and because of the vsync support it seems to have, although I'm not sure if it's 100% perfect or as good as triplebuffer (though I've read that seems slightly imperfect at times for some people too).  Also I saw logs in the Soft 15Khz threads which seem to indicate that in Windows mame can actually utilize the refresh rate of modelines - at least it prints out that information, which it does for SDL 1.3 but not for SDL 1.2. So I assumed it was at least able to pick it somewhat - much better than SDL 1.2 totally ignoring it and X just picking the first modeline that matches the height and width.  It sounds like the best solution would be to dynamically create the modelines for X, if only X were stable in that area.  It seems like the right way to do it according to the X documentation - use the xrandr protocol and add/choose modes that way - and you would think SDL could at least do that for us, so we wouldn't need modelines or have to mess with them in mame at all.
I think SDL is doing it in the most stable way currently available, though, and that seems to require modelines. I've looked at the idea of porting SDL 1.3 code into 1.2 just to include vsync, and/or taking code from SDL 1.3 and including it in mame.  Both options look really complicated, would require a lot of heavy lifting, and would end up being hard to maintain across releases and outdated as soon as SDL 1.3 is done.  It really seems like SDL 1.3 is close, with just the mouse hiding needing to be done - but then again I'm not sure; from the mailing lists it looks like it may settle down sooner or later into a stable, fully working release.  Is using SDL on Windows a bad thing compared to the other options there?  You would think SDL 1.3 would fix both Linux and Windows, if it also works well in Windows, possibly with some additions along the lines of triplebuffer or proper vsync.  I have also read that the vsync issues are really an X problem, and they have indicated that a future release will have some option for vsync in the X server.  I'm not sure if that's in newer ones - distributions currently use 1.7.7 and upstream is on 1.9.0, so I'm not sure it's been done yet (and 1.9.0 doesn't compile the radeon driver, so I'm guessing it's not stable and not all parts have caught up to it yet).
SwitchRes / GroovyMame - http://arcade.groovy.org
Modeline Generator and Mame Wrapper for Windows or Linux
LiveCD of Groovy Arcade Linux for Arcade Monitors
GroovyMame - generate arcade resolutions like advancemame
--
The Groovy Organization

Calamity

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 5469
Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #58 on: October 17, 2010, 05:53:53 pm »
Is using SDL on Windows a bad thing compared to the other options there?  Would think that SDL 1.3 would fix both Linux and Windows, if it also works well in Windows that is.  Possibly just adding some type of fixes to be like triplebuffer or proper vsync.  I also have read that the vsync issues are really an X windows one and they have indicated in a future release they will have some option for vsync with the X server.

I'm right now testing Win32 SDLMame on my laptop; after some tweaking it's running perfectly smooth and vsynchronized. The right settings in Windows seem to be: video opengl, keepaspect 0, unevenstretch 0, waitvsync 1, throttle 0, switchres 1. If 'video soft' is used, I can't get vsync working. I still have to test it using custom video modes in my cabinet, but it's quite probable it can replicate the DirectX functionality for managing modes. However, there's still this inherent limitation of integer values for refresh rates, imposed by the system's and drivers' inner calls. In Windows, any API we use for going fullscreen will be conditioned by that.
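Put into an ini, those settings would look like this (option names exactly as listed in the post):

```
# mame.ini fragment for SDLMame, per the settings described above
video          opengl
keepaspect     0
unevenstretch  0
waitvsync      1
throttle       0
switchres      1
```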

From a Windows point of view, the Linux way of using the xrandr method seems strange - it looks like resizing the desktop before maximizing the window, affecting all applications. Although we have Win32 APIs to resize the desktop resolution, they are not used in this context; the DirectX API (and SDL, I suppose) is used instead to switch video modes and use the screen in exclusive mode, so you can access the advanced video features. The method I suggested would just use plain SDL to go fullscreen, i.e. "320x224", but we'd make sure the proper refresh is used by making a unique modeline available to the driver at the 320x224 resolution, so it should not fail... anyway, this is just theory.
CRT Emudriver, VMMaker & Arcade OSD downloads, documentation and discussion:  Eiusdemmodi

bitbytebit

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 896
    • The Groovy Organization
Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #59 on: October 18, 2010, 02:43:19 am »
Is using SDL on Windows a bad thing compared to the other options there?  Would think that SDL 1.3 would fix both Linux and Windows, if it also works well in Windows that is.  Possibly just adding some type of fixes to be like triplebuffer or proper vsync.  I also have read that the vsync issues are really an X windows one and they have indicated in a future release they will have some option for vsync with the X server.

I'm right now testing Win32 SDLMame on my laptop; after some tweaking it's running perfectly smooth and vsynchronized. The right settings in Windows seem to be: video opengl, keepaspect 0, unevenstretch 0, waitvsync 1, throttle 0, switchres 1. If 'video soft' is used, I can't get vsync working. I still have to test it using custom video modes in my cabinet, but it's quite probable it can replicate the DirectX functionality for managing modes. However, there's still this inherent limitation of integer values for refresh rates, imposed by the system's and drivers' inner calls. In Windows, any API we use for going fullscreen will be conditioned by that.

From a Windows point of view, the Linux way of using the xrandr method seems strange - it looks like resizing the desktop before maximizing the window, affecting all applications. Although we have Win32 APIs to resize the desktop resolution, they are not used in this context; the DirectX API (and SDL, I suppose) is used instead to switch video modes and use the screen in exclusive mode, so you can access the advanced video features. The method I suggested would just use plain SDL to go fullscreen, i.e. "320x224", but we'd make sure the proper refresh is used by making a unique modeline available to the driver at the 320x224 resolution, so it should not fail... anyway, this is just theory.


Are you testing SDL 1.3 or 1.2 in Windows?  I can't get the throttle setting to work at 0; it just goes full speed in Linux. I'm using 1.2 with opengl, and have also tried 1.3, but there I've been using -video sdl13.

There are quite a few bugs I've found with my method of width-size incrementing, but I've fixed them somewhat and now it kind of works as a workaround, though I'm definitely hopeful it can be avoided.

I also found a big bug/memory leak I introduced in the radeon driver by the way I was getting the modelines. I think that was the instability I was seeing with xrandr, since it calls that function on each and every call to re-add the modelines.  I'm pretty sure it should work smoothly now, and it sounds interesting to just add the modeline for each game on start and remove it after finish.  My perl script could easily do this: use lrmc to calculate the modeline, add it and switch to it, then switch back to the default and delete that modeline on exit.  This seems like the cleanest way to do things to me, and it avoids needing to patch mame - beyond patching the radeon driver and possibly the xserver to avoid the annoying extra modelines that override custom ones.  It would be nice to have mame call the xrandr stuff itself, but I get the feeling that's really tied into SDL, and we'd get the same functionality through the perl script method while avoiding having to pack lrmc and xrandr into mame's code and maintain it.  Plus SDL 1.3 will do pretty much the same, and should be able to use refresh rates as doubles (I think my patch allows this, but I haven't inspected it to check whether SDL itself limits them to integers).  Again, I guess it comes back to the xrandr perl script plus external lrmc being the most universal solution, if SDL 1.3 still can't call the xrandr functions perfectly.
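The add/switch/restore/delete cycle described here maps onto stock xrandr roughly as follows. This is a dry-run sketch: the modeline numbers, mode name and output name (VGA-0) are placeholders, and in practice lrmc would generate the modeline for the game's exact refresh; set XRANDR=xrandr to actually run the commands.

```shell
# Dry-run sketch of the per-game throw-away modeline flow (placeholders).
XRANDR="${XRANDR:-echo xrandr}"   # dry run by default; XRANDR=xrandr to switch for real

add_game_mode() {   # create and select the game's modeline
  $XRANDR --newmode "320x240x60" 6.40 320 336 368 416 240 243 246 256
  $XRANDR --addmode VGA-0 "320x240x60"
  $XRANDR --output VGA-0 --mode "320x240x60"
}

drop_game_mode() {  # restore the desktop mode and clean up
  $XRANDR --output VGA-0 --mode "640x480"
  $XRANDR --delmode VGA-0 "320x240x60"
  $XRANDR --rmmode "320x240x60"
}
```

A wrapper would call `add_game_mode`, launch the game, then call `drop_game_mode` on exit.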

   
Hopefully tomorrow I'll have an updated genres with the radeon driver memory leak fixed, an lrmc that increments horizontal size for SDL 1.2, and possibly an xrandr perl script that dynamically generates and forces modelines from the ini files (which should eventually make the ini files unnecessary; maybe build a database of resolutions from mame.xml so the script knows each game's needs).  I have also gotten lrmc to output much better aspect ratios for vertical games; before, the calculation was rather strange. It never matched the modelines I saw others generate, and it made it hard to match refresh rates without getting odd sizes.  For some reason it divided the horizontal size of a vertical game by 0.5625; now I multiply the vertical size by 1.33333 for a 4:3 screen, which seems to work better (I'm really curious, though: a lot of people seem to use 1.22222, and I'm not sure why).  Here's basically how that's done...
Code:
if (mode->aspect3x4 == 1) {
  /* Vertical games */
  //mode->hres = mode->hres / 0.5625;
  mode->hres = align(mode->vres * 1.33333, 8);
} else if (!mode->aspect3x4 && (mode->hres < mode->vres)) {
  /* Odd games */
  if (mode->refreshrate <= 30.5) {
    mode->refreshrate = mode->refreshrate * 2.0;
    //mode->vres = align(mode->vres / 2.0, 8);
  } else {
    //mode->hres = mode->hres / 0.5625;
    mode->hres = align(mode->vres * 1.33333, 8);
  }
}
SwitchRes / GroovyMame - http://arcade.groovy.org
Modeline Generator and Mame Wrapper for Windows or Linux
LiveCD of Groovy Arcade Linux for Arcade Monitors
GroovyMame - generate arcade resolutions like advancemame
--
The Groovy Organization

bitbytebit

Re: Radeon X Driver for ArcadeVGA capability and lrmc .ini file generation
« Reply #60 on: October 18, 2010, 06:35:31 am »
Is using SDL on Windows a bad thing compared to the other options there?  I would think SDL 1.3 would fix both Linux and Windows, if it also works well on Windows; possibly it just needs some fixes along the lines of triple buffering or proper vsync.  I've also read that the vsync issues are really an X Window System problem, and they have indicated that a future release will have some option for vsync in the X server.

I'm right now testing Win32 SDLMame on my laptop; after some tweaking it's running perfectly smooth and vsynchronized. The right settings on Windows seem to be: video opengl, keepaspect 0, unevenstretch 0, waitvsync 1, throttle 0, switchres 1. If 'video soft' is used, I can't get vsync working. I still have to test it using custom video modes in my cabinet, but it's quite probable it can replicate the DirectX functionality for managing modes. However, there's still the inherent limitation of integer values for refresh rates, imposed by the system and driver calls. On Windows, any API we use for going fullscreen will be conditioned by that.
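Collected into a mame.ini fragment, the settings reported above would look like this (option names as given in the post; worth verifying against `mame -showconfig` on the build in question):

```
video            opengl
keepaspect       0
unevenstretch    0
waitvsync        1
throttle         0
switchres        1
```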

From a Windows point of view, the Linux xrandr method seems strange: it looks like resizing the desktop before maximizing the window, which affects all applications. Although there are Win32 APIs to resize the desktop, they are not used in this context; the DirectX API (and SDL too, I suppose) is used instead to switch video modes and take the screen in exclusive mode, so you can access the advanced video features. The method I suggested would just use plain SDL to go fullscreen at, say, 320x224, but we would make sure the proper refresh is used by making a unique modeline available to the driver for that resolution, so it should not fail... anyway, this is just theory.


I have xrandr working great now, doing it the way you're suggesting: dynamically creating the modelines when starting each game, so it knows exactly what it will get.  After fixing the radeon driver memory leak it's very fast and no longer unstable.  It works better than anything I've seen yet, and it's very good at getting the vertical refresh as close as possible for a given monitor.  I still need to look into whether SDL 1.3 really fixes vsync when the rate isn't exact, or what other options there are in Linux SDL.  I do see the option "-refreshspeed: automatically adjusts the speed of gameplay to keep the refresh rate lower than the screen", which looks like it might help when games can't get a perfect refresh rate.  I also have an xrandr patch; it requires the newest git xrandr, because older versions used integers for the modeline dotclock value.  The newest doesn't, but I modified it to be truly exact, as there was still a slight bit of decimal rounding.  There's an xrandr.diff patch for that now, and the mame_xrandr.pl script takes the game and monitor type, like "mame_xrandr.pl pacman -m d9800"; if no monitor is given it uses lrmc.conf.  It supposedly takes advancemame-style clock lines to get the right blanking times, which is more interesting than using example modelines.  The source seems to have that information in it; I haven't seen it discussed anywhere else, and using it only sort of worked for me, so either I had a bad line or it's a bit buggy.  In general I think the xrandr method is really the way to go: it avoids messy modelines and, exactly as you'd thought, really forces the exact resolution.  Building this into MAME might be interesting, but then again it might not gain us much over the perl script.
You still have to create the .ini files; I think that's best, because they're basically a database anyway, and we're letting MAME do the mode switching while just adding modelines to X for MAME to find.  The .ini files also carry extra information for games that need keepaspect and unevenstretch, either because they're at the default resolution or because no modeline with a large enough vertical height can be calculated for them.  This version really ought to be interesting because it opens up a lot more possibilities with xrandr, makes things much easier, and might eventually let us drop the modified X servers/drivers entirely.  (I'm not fully sure they're still needed; they're probably nicer for an arcade monitor, since the modified ones prevent stray odd resolutions from being generated, but most of those can be removed just by setting the HorizSync/VertRefresh values correctly in xorg.conf.  Although with a WG 9800, which has a big range, X goes a bit crazy with the defaults.)

Another thing I can't figure out is how to get good console support.  All the framebuffer drivers seem aimed at VESA modes from the BIOS and won't allow fully alterable modelines; I think they actually force the pixel clock to an integer or something, the same as old xrandr versions did.  uvesafb seems close to working; radeonfb doesn't seem to detect the newer ArcadeVGA cards, since they have RV600 chips and the Linux fb driver only goes up to the RV400 chips.  So my Linux console is always a bit out of whack even for a d9800, and I'm sure a standard arcade monitor would just hate the mode it's in.  I'm not sure exactly how the arcade Linux distributions handle this, but it's something I've been trying to figure out too.

So your theory holds up pretty well, going by my xrandr perl script prototype/working model :).  I'm actually quite amazed, because the instability I had been seeing with xrandr, and the wrong modelines, turned out to be the memory leak and the older xrandr versions.  Once those two things were fixed, xrandr works great (that, plus figuring out exactly how to use the utility; the perl script does all the dirty work of getting xrandr to add and delete modelines for you).

Version .10 is up, and it's hopefully working well with xrandr, and even decently with the SDL 1.2 hres fix, although I can see what you mean about the screen getting pushed further and further right.  I did make it recalculate the modeline with the new values without aligning to 8 pixels.  I found a general problem with this method, though: in MAME you get a chopped right side, even on vertical games, equal to the difference from the 8-pixel alignment.  So I guess that makes xrandr even better, and necessary; otherwise you'd have to shift 8 pixels per modeline to keep MAME from chopping pixels.

bitbytebit

Got confirmation from Wells Gardner about the D9800 horizontal frequency ranges.  He says they are not fixed, there is no danger to the monitor, and it can basically run anywhere in the range of 15.250 - 40.00 kHz.  Pretty interesting; it seems it isn't like the d9200, which sounds like it has fixed points for the CGA/EGA/VGA bands.  That's good, because I've been getting very close to exact vertical refresh rates for most games now by allowing the horizontal frequency to go up to 19 kHz, so the d9800 should be able to handle the vsync issues natively just by setting the modeline up right.  I still need to figure out why a few of the modelines that get up closer to 20 kHz are skewed slightly left, although it's mostly vertical games like pacman, where it's not a big issue.  I seem to need an extra display mode in lrmc for the range just above CGA, between it and EGA, with different horizontal blanking timings.  I have a set of changed horizontal values that seems to shift it back, but I need to separate it out and use it only for the 17-20 kHz range, since applying it to lower frequencies skews them the opposite way.

bitbytebit

Version 0.11 makes this a much simpler solution, universal for all graphics cards and best for ones with low dotclock abilities.
* No .ini files or modelines in xorg.conf are needed any longer; modelines are generated dynamically and pushed into X with xrandr, then removed when done.
* Any emulator can now be run with the dynamic modelines and the switchres wrapper.
* Removed all the Xorg/Radeon stuff; just patches for each now, since installing a patched X is no longer really necessary.
* Big improvements to lrmc's modeline decisions; it's now a good general modeline creator and should be useful for Soft15Khz modeline creation or any other system.
* Much easier to get working immediately: just install the modified lrmc, install the newest xrandr with my patch applied, and wrap MAME, MESS, or any other emulator with switchres.
* Should be much easier to eventually get this working on Windows too; we just need to figure out ways to dynamically add custom modelines to the drivers and remove them.  lrmc compiles under Windows, and switchres is Perl, so it will run on Windows.  Not working yet, but the goal is to eventually make this a universal dynamic modeline generator across all platforms for emulators to use.
* The zip file can be attached to this message again, because it's now very small without all the extra X.org stuff.

bitbytebit

I've wondered about not all games having correct information; I can definitely see some where the info just doesn't seem right, and it's hard to get them to display nicely.  The pixel clock values are there for some, too, but when I try to use them in my calculations (lrmc can take them as a minimum), some games just become too small on the screen.

Pixel clock values in mame.xml, as well as porch values, are precious information in their own right, and also if we intend to replicate the exact original video signal. But I'm afraid this is not the way to go if we want to have hundreds of games simultaneously centered on the monitor: we must redefine the pixel clock for each video mode based on the porches (borders) we want, which should be as constant as possible across video modes, to avoid having to readjust the monitor all the time (this is only possible with horizontal borders; vertical amplitude must be adjusted manually). At the same time we have to keep the vertical frequency as close as possible to the original.

It's definitely fun to work on this.  I started this arcade cabinet as a project for myself and had to use Linux, since that's all I've ever used for anything, which has led me on a mission to share what I'm doing; basically I want to make this stuff work on Linux as well as it does on Windows.  Oddly, I'm surprised that we've probably already surpassed Windows in some ways, because of the ability to access the hardware directly through the X server, have unlimited modelines, and have no doors permanently closed, so we should be limited only by what the hardware can really do and the time it takes to program it.

That functionality is definitely not available in Windows. Well, I have managed to trick the Catalyst video driver into "resetting" itself on the fly so it reads the registry modelines without the need of rebooting, so I could eventually have unlimited modelines in Windows too, but I still have to figure out how to make this available to different emulators (I believe the only option is a sort of loader), and your Linux method is definitely much cleaner and more straightforward.

My biggest concern with Linux and the X Radeon driver is whether you can really achieve proper vsync functionality. I have heard there were problems with these cards and vsync; I hope it's been solved. Some folks had problems with this on the Spanish forum:

http://www.retrovicio.com/foro/showthread.php?t=8250&highlight=investigaci%C3%B3n

Thanks for the good work!





Guessing from reading this, you could probably trick the radeon driver in Windows into taking the lrmc-generated modeline from my new switchres script, using your method on Windows instead of xrandr on Linux.  This sounds quite interesting. Do you think other Windows graphics drivers can be tricked this way, or have they been?  How exactly would I need to adapt switchres to do this when run on Windows?  Given the functionality I'm getting from switchres on Linux now, generating good modelines dynamically without relying on .ini files, combining it with your ability to push modelines into the radeon driver on Windows would be great.  If this could be done for even more video cards on Windows, we'd really have something extra interesting.  If we can make switchres work exactly the same on both Windows and Linux, that would be great, and at least with the radeon driver it seems very possible.

elvis

I'm having trouble patching XRandR with the 0.11 download above.

I do:

# unzip GenRes-0.11.zip
# cd GenRes-0.11
# git clone git://anongit.freedesktop.org/xorg/app/xrandr
# cd xrandr
# git apply --stat ../patches/xrandr.diff
 xrandr.c |   18 ++++++++++--------
 1 files changed, 10 insertions(+), 8 deletions(-)
# git apply --check ../patches/xrandr.diff
error: patch failed: xrandr.c:1426
error: xrandr.c: patch does not apply

Have I done something wrong?
« Last Edit: October 20, 2010, 08:05:58 am by elvis »

bitbytebit

I'm having trouble patching XRandR with the 0.11 download above.

I do:

# unzip GenRes-0.11.zip
# cd GenRes-0.11
# git clone git://anongit.freedesktop.org/xorg/app/xrandr
# cd xrandr
# git apply --stat ../patches/xrandr.diff
 xrandr.c |   18 ++++++++++--------
 1 files changed, 10 insertions(+), 8 deletions(-)
# git apply --check ../patches/xrandr.diff
error: patch failed: xrandr.c:1426
error: xrandr.c: patch does not apply

Have I done something wrong?

Try patching it like this from within the xrandr directory...


cat ../patches/xrandr.diff | patch -p1 -E -l

That most likely should do it; I think I did something odd there, and that command ignores whitespace differences.  I'll maybe include xrandr itself from now on, because it's small, and I'd actually like to somewhat merge its functionality with lrmc in the future.


Thanks for catching that and letting me know the patch is a bit off :)

elvis

cat ../patches/xrandr.diff | patch -p1 -E -l
Yup, the "-l" flag fixed it.  Cheers.  I'd tried "patch" without "-l" before and it failed too, but all good now.

elvis

I have one lingering problem.  I don't think it's related to your switchres program, but something else.

I have a default 640x480@60Hz interlaced modeline in Xorg (if I have no modeline, Xorg won't start).  Running Ubuntu 10.04.

Whether I run switchres, or put the following in .xinitrc manually:

xrandr --newmode "416x241"  7812000 416 424 464 496 241 245 248 264 -HSync -VSync
xrandr --addmode default "416x241"
xrandr --verbose --output default --mode "416x241"
mame -verbose -resolution 416x241x32@59.66 mvsc

I get the following output in my log file (with a few extra "xrandr -q" statements to make sure things are sane):

Screen 0: minimum 640 x 480, current 640 x 480, maximum 640 x 480
default connected 640x480+0+0 0mm x 0mm
   640x480       60.00*

(pre xrandr --newmode)

Screen 0: minimum 416 x 241, current 640 x 480, maximum 640 x 480
default connected 640x480+0+0 0mm x 0mm
   640x480       60.00*
   416x241       59.66  

(new mode added successfully)

xrandr: Configure crtc 0 failed
crtc 0:      416x241  59.66 +0+0 "default"
crtc 0: disable
screen 0: revert
crtc 0: revert

Now this is the bit that's driving me nuts: for some reason XRandR can't switch to the new mode.  MAME's verbose output confirms the mode remains 640x480 interlaced, and it has not switched over (with or without "switchres" makes no difference).

I'm using the nv_drv.so module (2.1.15), which I've hacked to allow low pclocks (default was 12, and I've dropped mine down to 6MHz).

xrandr was giving me this issue before I used your hacked version, too.

If I set a modeline manually in /etc/X11/xorg.conf, I can get the resolution fine (although the screen tearing is there due to the rounded vertrefresh issues above).  If I can get xrandr to switch resolutions, I'm home free.  This is the last hurdle for me before I can get this working properly.

Any advice?
« Last Edit: October 21, 2010, 12:00:59 am by elvis »

bitbytebit

I have one lingering problem.  I don't think it's related to your switchres program, but something else.

I have a default 640x480@60Hz interlaced modeline in Xorg (if I have no modeline, Xorg won't start).  Running Ubuntu 10.04.

Whether I run switchres, or put the following in .xinitrc manually:

xrandr --newmode "416x241"  7812000 416 424 464 496 241 245 248 264 -HSync -VSync
xrandr --addmode default "416x241"
xrandr --verbose --output default --mode "416x241"
mame -verbose -resolution 416x241x32@59.66 mvsc

I get the following output in my log file (with a few extra "xrandr -q" statements to make sure things are sane):

Screen 0: minimum 640 x 480, current 640 x 480, maximum 640 x 480
default connected 640x480+0+0 0mm x 0mm
   640x480       60.00*

(pre xrandr --newmode)

Screen 0: minimum 416 x 241, current 640 x 480, maximum 640 x 480
default connected 640x480+0+0 0mm x 0mm
   640x480       60.00*
   416x241       59.66  

(new mode added successfully)

xrandr: Configure crtc 0 failed
crtc 0:      416x241  59.66 +0+0 "default"
crtc 0: disable
screen 0: revert
crtc 0: revert

Now this is the bit that's driving me nuts: for some reason XRandR can't switch to the new mode.  MAME's verbose output confirms the mode remains 640x480 interlaced, and it has not switched over (with or without "switchres" makes no difference).

I'm using the nv_drv.so module (2.1.15), which I've hacked to allow low pclocks (default was 12, and I've dropped mine down to 6MHz).

xrandr was giving me this issue before I used your hacked version, too.

If I set a modeline manually in /etc/X11/xorg.conf, I can get the resolution fine (although the screen tearing is there due to the rounded vertrefresh issues above).  If I can get xrandr to switch resolutions, I'm home free.  This is the last hurdle for me before I can get this working properly.

Any advice?

Try the newest nvidia driver, hack it for low pixel clocks, and see if that fixes things.  You may also need a newer xorg-server, though hopefully not, since its many parts are a pain to update.  What version is your xorg-server?  Mine is 1.7.7, but I'm not sure which versions work best with xrandr.  The xrandr output indicates the resolution change isn't being accepted, and I assume it's the nvidia driver it's talking to.  Possibly the xrandr support in there has improved recently; there are some changes in the current git logs, but I'm not sure they really fix this issue.
http://cgit.freedesktop.org/xorg/driver/xf86-video-nv/

elvis

What version is your xorg-server?  Mine is 1.7.7

root@lowboy:~# dpkg -l xorg xserver-xorg-core
ii  xorg                      1:7.5+5ubuntu1            X.Org X Window System
ii  xserver-xorg-core         2:1.7.6-2ubuntu7.3        Xorg X server - core server

1.7.6 according to that.

The supplied version of nv_drv was 2.1.15.  I've just tried 2.1.18 (the latest stable release, dated July 2010) and it gave the same error.  I tried to build from git, but it said a required tool (specifically xorg-macros) was too old.

Next stop might be dist-upgrading to Maverick (10.10) and trying again.  According to packages.ubuntu.com it uses xserver-xorg-core 1.9.0.

bitbytebit

What version is your xorg-server?  Mine is 1.7.7

root@lowboy:~# dpkg -l xorg xserver-xorg-core
ii  xorg                      1:7.5+5ubuntu1            X.Org X Window System
ii  xserver-xorg-core         2:1.7.6-2ubuntu7.3        Xorg X server - core server

1.7.6 according to that.

The supplied version of nv_drv was 2.1.15.  I've just tried 2.1.18 (latest stable release, dated July 2010) and it gave the same error.   I tried to build from git, but it said a required tool (specifically xorg-macros) was too old.

Next stop might be dist-upgrading to Maverick (10.10) and trying again.  According to packages.ubuntu.com it uses xserver-xorg-core 1.9.0.
Yeah, I can't see any direct xrandr support in the drivers; it looks like xrandr talks to the xorg-server directly, which calls the driver's get_modelines function, so the xrandr handling is done entirely within the xorg-server, I guess.  Hopefully it's just that xrandr didn't work as well in that version as it does in 1.7.7; I think my default Gentoo xorg-server didn't work until I updated to the vanilla 1.7.7 one.  Version 1.9.0 looks nice; I'd be interested to hear whether it fixes it.  I really want to try the newest version, but I'm not sure when Gentoo will move to that high a version, and I've spent a lot of time building this box with Gentoo, though it's tempting seeing Ubuntu already using the newest one.

elvis

Upgrading to Maverick as I type.

There are also the "nouveau" drivers now, which do support my TNT2 (and apparently work fine in 2D/"soft").  However, they use the newfangled kernel modesetting stuff, so I'm not sure where to hack the minimum pixel clocks (or even whether I can!).  But that will be my next step if Maverick doesn't play ball.

[edit]

Maverick upgrade is in.  Same issues as before with the nv_drv.so (even the latest from git).  It definitely looks like that driver can't understand XRandR requests.

nouveau certainly understands them (I can see the resize requests coming through in the log file), but the pixel clocks limits must be too high because it's throwing errors.

So, now I'm off to hack the nouveau drivers (and hopefully just the userspace stuff, and not kernel stuff too).

Maybe it's time to order an old ATI card off eBay? :)
« Last Edit: October 21, 2010, 12:38:43 am by elvis »

Calamity

Guessing from reading this, you could probably trick the radeon driver in Windows into taking the lrmc-generated modeline from my new switchres script, using your method on Windows instead of xrandr on Linux.  This sounds quite interesting. Do you think other Windows graphics drivers can be tricked this way, or have they been?  How exactly would I need to adapt switchres to do this when run on Windows?  Given the functionality I'm getting from switchres on Linux now, generating good modelines dynamically without relying on .ini files, combining it with your ability to push modelines into the radeon driver on Windows would be great.  If this could be done for even more video cards on Windows, we'd really have something extra interesting.  If we can make switchres work exactly the same on both Windows and Linux, that would be great, and at least with the radeon driver it seems very possible.

Hi bitbytebit,

I keep following this thread, it's getting really interesting, though I've just become a father this week and it's really hard to catch up  ;D

Now that you bring back that post about the 'loader' idea I had in mind, I see it's exactly what you've succeeded in implementing for Linux; it's fantastic. Think of the endless possibilities of dynamic modelines: for instance, you could write a new 'advv' clone that finds your monitor ranges and centers/tweaks modes, then feeds the results back to lrmc so it can create even better modelines for your hardware. This is why I made the Arcade_OSD program, to test this functionality, though it will only work with my hacked Catalyst (CRT_Emudriver). I understand you've used a video mode DB to get rid of the inis; good.

The same scheme would work for Windows, that's for sure, but it's complicated to make a general method for all cards, as I'll explain. Windows video drivers only parse the registry for custom modelines at startup; that's why you need to restart the system all the time to test changes... annoying. If we could reset the driver, unloading and reloading it so it goes through its initialization routines again and re-reads the registry keys after we modify them, there would be a chance to get it. As it happens, there is a documented way of doing this! Here it is:

http://msdn.microsoft.com/en-us/library/ff568527%28VS.85%29.aspx

Basically, it works by setting 640x480 with 16 colors (4 bits) and immediately restoring the original mode. Since 640x480x16 is usually implemented by Windows' default video driver, calling it unloads the specific video driver from memory. To get it working, after that I need to ask Windows for the available video modes. It's stable, works really well, and is reasonably fast. Unfortunately, I've only gotten it working with my hacked Catalyst 6.5, and (very strangely) only when the number of defined modelines is big enough (I still have to find the reason for this). No luck with Catalyst 8.x on my office computer, nor with ForceWare on my laptop, though I haven't tested those much. I believe, as the article says, it's because those drivers have native support for 640x480x16 colors, so they never get unloaded :( However, it's a matter of testing and investigation; I unfortunately have very little time for this, and I hope someone will use this stuff to do it.

There's a limitation to this method: you can modify existing modes, but not create new ones on the fly (that needs a restart). The reason, I believe, is that Windows internally only asks the driver for the available video modes during the startup sequence, so it won't update its internal mode table until we restart. But this limitation can easily be overcome by preparing a general mode table of the needed resolutions (with no vfreq defined) and tweaking the chosen modeline before calling the emulator, following the loader/wrapper scheme.

At this point, I'd really consider having a look at lrmc's method for calculating modelines. It's funny, because I wrote VMMaker from scratch, figuring out all the calculations myself, and lrmc is still a black box to me; if only I had more time to study it. I'm convinced the approach I use in VMMaker is better. However, this is a secondary matter.

Your D9800 is definitely a fantastic monitor; it's incredible that it has a continuous range. But it's hard for me to imagine how that works on the hardware side. At the end of the day the intervals must exist, because porches and sync pulses need to get smaller as hfreq increases, so there should be jumps somewhere (maybe that's why you experience centering shifts at some points). It seems to work like an automatic car: you just put your foot on the accelerator, but the car changes gears internally.

I'm also concerned about the vsync stuff in Linux. Now that I've tested SDLMame for Windows, which I believe is the same code as Linux, I think you should be able to turn throttle off as I do; if it runs at full speed, it's because vsync isn't really working. That also happened to me when I used 'video soft' instead of opengl. Remember that vsync is a must: even if you can get really accurate vfreqs (you'll normally be a couple of hundredths of a Hz above or below in the best cases), with throttle on, MAME keeps its internal clock running, and that produces regular hiccups in scrolls. We don't want MAME to do that; we want it to hang off our vfreq to be as smooth as possible.

There's a lot I should check. First of all, I'm not sure what version of SDL I have, whether SDLMame is using SDL at all, or what role opengl plays in all this, so I need to clarify some concepts for myself. Also, how to run perl scripts in Windows, etc.
CRT Emudriver, VMMaker & Arcade OSD downloads, documentation and discussion:  Eiusdemmodi

bitbytebit

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 896
    • The Groovy Organization
Guessing from reading this, you could probably trick the Radeon driver in Windows into taking the lrmc-generated modeline from my new switchres script, using your method in Windows instead of xrandr in Linux. This sounds quite interesting. Do you think other Windows graphics drivers can be tricked this way, or have they been already? How exactly would I need to go about adapting switchres to do this when run in Windows? Given the functionality I'm getting in Linux with switchres now, generating pretty good modelines dynamically without relying on .ini files, it would be great to combine that with your ability to push modelines into the Radeon driver in Windows. If only this could be done for even more video cards in Windows too, then we'd really have something extra interesting. If we can make switchres work exactly the same in both Windows and Linux, that would be great, and at least with the Radeon driver this seems very possible.

Hi bitbytebit,

I keep following this thread; it's getting really interesting, though I've just become a father this week and it's really hard to catch up  ;D

Now that you bring back that post about the 'loader' thing I had in mind, I see it's exactly what you have succeeded in implementing for Linux. It's fantastic. Think of the endless possibilities of dynamic modelines: for instance, you could write a new 'advv' clone that lets you find your monitor's ranges and center/tweak modes, then use the results as feedback so lrmc can create even better modelines for your hardware. This is why I made the Arcade_OSD program, to test this functionality, though it will only work with my hacked Catalyst (CRT_Emudriver). I understand you've used a video mode DB to get rid of inis: good.

The same scheme would work for Windows, that's for sure, but it's complicated to make a general method for all cards, as I'll explain. Windows video drivers only parse the registry for custom modelines at startup. That's why you need to restart the system all the time to test changes... annoying. If only we could reset the driver by unloading and reloading it, so that it went through its initialization routines again and re-read the registry keys after we modify them, there would be a chance. As it happens, there is a documented way of doing this! Here it is:

http://msdn.microsoft.com/en-us/library/ff568527%28VS.85%29.aspx

Basically, it works by setting 640x480 at 16 colors (4 bits) and immediately restoring the original mode. Since 640x480x16 is usually implemented by Windows' default video driver, calling it unloads the specific video driver from memory. To get it working, I then have to ask Windows for the available video modes again. It's stable, works really well, and is reasonably fast. Unfortunately, I only got it working with my hacked Catalyst 6.5, and (very strangely) only when the number of defined modelines is big enough (I still have to find the reason for this). No luck with Catalyst 8.x on my office computer, nor with ForceWare on my laptop, but I haven't tested those much. I believe, as the article says, it's because these drivers have native support for 640x480 at 16 colors, so they never get unloaded :( However, it's a matter of testing and investigating, and I unfortunately have very little time for this; I hope someone will pick this stuff up.
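Reduced to pseudocode, the sequence looks like this (ChangeDisplaySettings and EnumDisplaySettings are real Win32 calls; the surrounding flow is my reading of the description above, not Calamity's actual code):

```
-- write/refresh the custom modeline values under the driver's registry key
write_registry_modelines()

-- force the vendor driver to unload by switching to a mode that only the
-- default VGA driver implements (640x480, 4-bit color)
ChangeDisplaySettings(mode = 640x480x4bpp)

-- restore the original mode; this reloads the vendor driver, which
-- re-reads the registry during its init routine
ChangeDisplaySettings(mode = original_mode)

-- re-enumerate so Windows refreshes its internal mode table
for i in 0..n: EnumDisplaySettings(device, i)
```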



Congratulations on becoming a father :)

Yeah, it's nice now without using inis. It's not really a video DB; I'm actually just doing a quick 'mame -listdev game' and grabbing the display section. I've been able to start testing multiple monitor types much more easily, just changing the command line to cga or ega, so it has been interesting to compare each one's limitations and what happens when it's more restricted. That Windows stuff sounds promising; glad there is a possible way to somewhat do it there. I suspect your methods are probably better than lrmc's, and I can see what you're saying about the gear changing as you climb higher in horizontal frequency. I think there needs to be some sort of dynamic altering of the porches as the values get higher, instead of what it does now with multiple static display sets, each with its own values. I've been finding that basically, as it moves higher in frequency, the divisor it uses for each of the porches has to be larger. I'm still learning how that works, but from what I can tell, if it dynamically generated the display values for a D9800 depending on the requested screen size and refresh rate, it could very likely always center the screen. I actually have it pretty much doing this for most games with my current set, which has chunks like 15.250-17.499, 17.500-20.000, 20.001-23.899, 23.900-25.500, 27-30.99, 31-32, 32.001-40. There are some places in there that need perfecting, but those are mostly for really odd games; most really are centered and full without borders at the right refresh rate. So if that could be adjusted dynamically, without all those preset areas, then the oddball games might work better too. I've been running into one issue with a few games that are widescreen, like the DBZ game, which seems to be a 1.5 aspect ratio; I have some odd recalculating of the frame size that kind of makes it work, but I haven't been satisfied with any method to automatically handle games that go above the normal 1.3333333 monitor aspect ratio.
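Those frequency chunks amount to a lookup table keyed on hfreq. A minimal sketch of that selection logic (Python rather than switchres's perl; the range boundaries are copied from the post, the set names are placeholders for the per-range porch values):

```python
# Map a computed hfreq (in kHz) to one of the static display ranges listed
# in the post for the D9800.  "set-A".."set-G" stand in for the real porch
# value sets, which are not given here.
RANGES = [
    ((15.250, 17.499), "set-A"),
    ((17.500, 20.000), "set-B"),
    ((20.001, 23.899), "set-C"),
    ((23.900, 25.500), "set-D"),
    ((27.000, 30.990), "set-E"),
    ((31.000, 32.000), "set-F"),
    ((32.001, 40.000), "set-G"),
]

def porch_set(hfreq_khz):
    for (lo, hi), name in RANGES:
        if lo <= hfreq_khz <= hi:
            return name
    return None  # falls in a gap (e.g. 25.5-27) or out of range

print(porch_set(15.72), porch_set(26.0), porch_set(31.5))
```

Replacing this table with a function of hfreq, as the post suggests, is exactly the "dynamic porches" idea: the return value would be computed instead of looked up.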

I found something odd about the waitvsync setting, though. I'm starting to think my video card can't handle it properly, or the X drivers for my video card/opengl implementation can't. I got vsync working on another computer using an nvidia Quadro FX 3400; my arcade system uses the newer ArcadeVGA 3000, which I gather is just a Radeon HD 2600 with the RV630 on it. When I start mame on the Radeon it always says 'OpenGL: FBO not supported' and also something about IRQs not being enabled, falling back to busy waits: 2 0. I'm guessing this is the problem: the card can't do interrupts with opengl, so it can't do the vsync stuff through opengl correctly. I'm thinking of trying an nvidia Quadro and testing whether it can do the low pixel clocks, which I'm guessing it can. It's sad if it's really true that the ArcadeVGA doesn't support waitforvsync in Linux, when it's supposed to be specifically for arcade systems, and yet this older nvidia card does it just fine. Although that's also with the proprietary nvidia drivers, so I'm not sure if it's the same for the open ones, which I'd have to use for the lower dotclock stuff. I'm now planning to test these nvidia cards and see if they are at all better than the Radeon; it just doesn't seem to make sense that they would be, but if vsync to the refresh rate works on them in Linux and the dotclocks can go low, then I guess they really are better suited for this.

In Windows, just get ActivePerl: it installs from the http://www.activestate.com/activeperl website pretty easily, and then you can run perl scripts in Windows just like any other program.
SwitchRes / GroovyMame - http://arcade.groovy.org
Modeline Generator and Mame Wrapper for Windows or Linux
LiveCD of Groovy Arcade Linux for Arcade Monitors
GroovyMame - generate arcade resolutions like advancemame
--
The Groovy Organization

Calamity

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 5469
I found something odd about the waitvsync setting, though. I'm starting to think my video card can't handle it properly, or the X drivers for my video card/opengl implementation can't. I got vsync working on another computer using an nvidia Quadro FX 3400; my arcade system uses the newer ArcadeVGA 3000, which I gather is just a Radeon HD 2600 with the RV630 on it.

I'm afraid the problem is the Radeon driver not implementing waitvsync properly, as the folks in the Spanish forum concluded; with nVidia they were able to make waitvsync work without problems.

Regarding porches and sync pulses, I am positive they stay constant through the range covered by my monitor, 15.625-16.670 kHz, so in order to keep my modes centered I have to use the same values all the time. This is related to the speed of the electron beam, which is constant in my monitor regardless of hfreq. There's a visible effect due to this: when you increase hfreq, the picture becomes narrower, because the extra lines (the beam speed being constant) need to come from somewhere: the sides of the picture! Your monitor must have more complex electronics, able to drive the electron beam at different speeds according to some predefined hfreq intervals (you would notice the narrowing effect somewhere), or... a really progressive beam speed, which would allow it to interpolate the porch values for the whole range.

bitbytebit

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 896
    • The Groovy Organization


Interesting, good to know. I guess I'm going to try my nvidia cards now; it sounds like I need that for vsync, and it would be great to finally get it working right, with all these resolutions having the right refresh rate.

I'm currently realizing I can now do all the odd calculations that lrmc was doing, like the vertical game resizing or dual screen adjustment, inside switchres now.  That is going to make it a lot simpler doing all the pre-work inside the perl script and just let lrmc take a resolution and refresh rate and expect it to not do any fiddling with it other than perfecting the modeline.  This makes it much easier to have dual monitor support, which switchres should basically have now with some changes I just made, pending testing. I had actually thought of the advv-style utility before, and right now I just bring up xvidtune in -testmode, but I think we need a better program than xvidtune. It seems that xvidtune is under the same spell the X drivers have fallen into, where it doesn't accept custom modelines very easily; or maybe it's just ignoring the xrandr modelines. At any rate, it basically only comes up with the modeline from xorg.conf, and even then won't allow altering it, claiming it's out of range unless it's a VESA-compliant modeline.
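For context, the end product lrmc is asked for is a single xorg.conf-style modeline. A toy generator shows the shape of the calculation; the 25%/5% blanking fractions and the sync positions below are arbitrary illustrations, nothing like lrmc's real monitor tables:

```python
# Toy modeline generator: given the active size and target vfreq, pad the
# frame with fixed blanking fractions and derive the dotclock.  All
# blanking/sync numbers are illustrative placeholders.
def modeline(w, h, vfreq):
    htotal = int(w * 1.25)   # ~25% horizontal blanking (arbitrary)
    vtotal = int(h * 1.05)   # ~5% vertical blanking (arbitrary)
    dotclock_mhz = htotal * vtotal * vfreq / 1e6
    hfreq_khz = dotclock_mhz * 1000 / htotal
    # sync start/end picked arbitrarily inside the blanking interval
    line = (f'Modeline "{w}x{h}@{vfreq:.2f}" {dotclock_mhz:.3f} '
            f'{w} {w + 8} {w + 40} {htotal} '
            f'{h} {h + 3} {h + 6} {vtotal}')
    return line, hfreq_khz

line, hfreq = modeline(320, 240, 60.0)
print(line)
print(f"hfreq = {hfreq:.2f} kHz")
```

A real generator works the other way around as well: it clamps hfreq and the dotclock to the monitor's limits and adjusts vtotal to hit the exact vfreq, which is where the per-range porch values come in.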

I am really interested in this aspect ratio issue, though, because from what I can tell it's calculated differently by everybody: some use 1.22222, some 1.333333, and lrmc was doing some odd division of the width by .56 on vertical games. Some games you have to leave alone; some need tweaking (like dbz, virtua racer, mortal kombat). It seems to be a range of vertical resolutions with aspect ratios either larger than 1.3 or smaller, but never below 1.25. I'm not sure if it's something I'm doing, but from what I can tell I can get dbz at the exact original resolution and refresh, but it's off the sides, and I have to do some odd calculations to make it a letterbox-type screen; then it fits. Virtua racer is the opposite: it has a smaller-than-1.3 aspect ratio, around 1.28, and it needs a slightly different screen size I have to calculate in a strange way. I have found algorithms that work with these games fairly consistently, but I'm trying to see the deeper logic in them, and whether they really translate to all games and/or which ranges. So far, if the vertical size is above 224 and below 400 and the aspect is above 1.3, I do one thing, while if it's below 1.3 but above 1.25 I do another. It seems to mostly work; possibly the only other unique thing is that the ones above 1.3 may only qualify for my change if they are below 56 Hz vertical refresh. That's something I've been trying to wrap my brain around the last day or so. It seems like a pixel-aspect-ratio versus screen-aspect-ratio thing, and I'm not sure if lrmc is the culprit and I'm fighting it, or if this information just isn't in mame for these games, which makes it hard to get display output that matches all parameters and fits and refreshes perfectly.
This of course isn't practical for normal arcade monitors, doing all the games that way, but for the d9800 it seems to be an issue because you can actually match most of them, so it gets somewhat tricky: unlike with the AVGA card, or in Windows, you don't just have a set of 30 modelines where you know the games will sometimes have borders, leaving extra space that makes up for these odd-aspect-ratio games.

Calamity

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 5469
You may have already checked this, but just in case:

http://www.bannister.org/forums/ubbthreads.php?ubb=showflat&Number=46353&page=1

I'm currently realizing I can now do all the odd calculations that lrmc was doing, like the vertical game resizing or dual screen adjustment, inside switchres now.  That is going to make it a lot simpler doing all the pre-work inside the perl script and just let lrmc take a resolution and refresh rate and expect it to not do any fiddling with it other than perfecting the modeline.

Definitely YES. I recommend keeping all the mode adaptation stuff in your script, and just passing lrmc a plain modeline to calculate; as your project progresses, you might get rid of lrmc altogether ;)

I am really interested in this aspect ratio issue, though, because from what I can tell it's calculated differently by everybody: some use 1.22222, some 1.333333, and lrmc was doing some odd division of the width by .56 on vertical games. Some games you have to leave alone; some need tweaking (like dbz, virtua racer, mortal kombat). It seems to be a range of vertical resolutions with aspect ratios either larger than 1.3 or smaller, but never below 1.25. I'm not sure if it's something I'm doing, but from what I can tell I can get dbz at the exact original resolution and refresh, but it's off the sides, and I have to do some odd calculations to make it a letterbox-type screen; then it fits. Virtua racer is the opposite: it has a smaller-than-1.3 aspect ratio, around 1.28, and it needs a slightly different screen size I have to calculate in a strange way. I have found algorithms that work with these games fairly consistently, but I'm trying to see the deeper logic in them, and whether they really translate to all games and/or which ranges. So far, if the vertical size is above 224 and below 400 and the aspect is above 1.3, I do one thing, while if it's below 1.3 but above 1.25 I do another. It seems to mostly work; possibly the only other unique thing is that the ones above 1.3 may only qualify for my change if they are below 56 Hz vertical refresh. That's something I've been trying to wrap my brain around the last day or so. It seems like a pixel-aspect-ratio versus screen-aspect-ratio thing, and I'm not sure if lrmc is the culprit and I'm fighting it, or if this information just isn't in mame for these games, which makes it hard to get display output that matches all parameters and fits and refreshes perfectly.
This of course isn't practical for normal arcade monitors, doing all the games that way, but for the d9800 it seems to be an issue because you can actually match most of them, so it gets somewhat tricky: unlike with the AVGA card, or in Windows, you don't just have a set of 30 modelines where you know the games will sometimes have borders, leaving extra space that makes up for these odd-aspect-ratio games.

I am not sure I follow you here, but... I wouldn't care about pixel aspect at all. Remember that arcade game pictures were stretched using the monitor potentiometers to cover the full 4:3 screen, regardless of whether they had 192, 224, 240, 256 or whatever vertical resolution, so speaking of pixel aspect there doesn't make sense, unless you intend to add artificial side borders to keep the picture 4:3 without adjusting the monitor controls (something I would understand on a TV with no service mode, but not on an arcade monitor). So if DBZ asks you for 416x224, just make that exact resolution, then adjust your monitor's v-amp to cover the screen; it works for me this way. For vertical games it's a different story.

Gray_Area

  • -Banned-
  • Trade Count: (+1)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 3363
  • -Banned-
AdvanceMAME did dynamic modeline generation at game start-up, and was initially Linux-based. Maybe you want to look at the source?

http://advancemame.sourceforge.net/
-Banned-

bitbytebit

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 896
    • The Groovy Organization

Thanks for reminding me. I actually wanted to do that and tried a while back, but I should take another look now. I have learned a lot more since I last tried, and suspect I can spot the stuff I need much more easily now. It definitely seems like what this is turning into is a wrapper that gives normal mame/mess, or other emulators, the abilities advancemame has/had when it was up to date. Hopefully it can be a little less dependent on hardware, but I am starting to see how that is really a brick wall with certain cards, whether it's the physical chips or the knowledge of how to get the chips to do what we need.

bitbytebit

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline Offline
  • Posts: 896
    • The Groovy Organization


Yeah, that's a thought I had too: it's definitely going to be possible eventually to avoid lrmc, and we could also support any modeline generator. I like the idea of porting the modeline generation into perl code, so it's easy to modify and more people can alter it.

Yeah, I'd seen that thread about the radeon directfb support. The odd thing is that this ArcadeVGA 3 radeon seems incompatible with the Linux kernel's radeonfb framebuffer, because the RV630 isn't supported, I guess. It doesn't allow the KMS stuff in newer kernels either, because the kernel doesn't recognize the atom bios signature. I assume that's because the bios is modified; to the Linux code it isn't really the same as those other radeon cards.

This pixel aspect thing is also odd, because the way I figured out the formula I'm using for virtua racer was to look at what mame's -verbose output for the GL shader was saying about the pitch values. It came up with something like 520x400 instead of the 496x384 it is normally said to be. So I came up with that value by taking the difference between the 1.3 monitor aspect ratio and the reported 1.29, times the height, and then the game aspect ratio of 1.29 times that new height. Somehow that calculates it out, and I think it works on other games: for dbz I can do a similar calculation, and it works for mortal kombat too, and possibly others. I'm still testing and figuring, but here's the basic formula, which for some reason seems to mostly work... ($GASPECT is the game aspect ratio; $MASPECT is the monitor aspect ratio, 1.3333333.)
Code: [Select]
                } elsif ($GASPECT > $MASPECT) {
                        # Game wider than the monitor: pad the width by a
                        # third of the height, then derive a 4:3 height
                        # (letterboxes the game).
                        $o_width  = $o_width + ($o_height / 3);
                        $o_height = $o_width / $MASPECT;
                        $o_width  = sprintf("%.0f", $o_width);
                        $o_height = sprintf("%.0f", $o_height);
                } elsif ($GASPECT >= 1.25 && $GASPECT < $MASPECT) {
                        # Slightly narrower than 4:3: grow the height by the
                        # aspect difference, then recompute the width at the
                        # game's own aspect ratio.
                        my $ADIFF = $MASPECT - $GASPECT;
                        $o_height = $o_height + ($ADIFF * $o_height);
                        $o_width  = $GASPECT * $o_height;
                        $o_width  = sprintf("%.0f", $o_width);
                        $o_height = sprintf("%.0f", $o_height);
                }


I really hope it's something else, since it does seem odd to have to deal with the aspect ratio at all, but so far it seems this really is necessary for some reason with the modelines. Actually, I'm surprised it seems to be working for the most part and making these games look good now; I didn't think it would work universally, but so far the code above might just be doing that. I don't totally understand why dividing by 3 works in that one calculation, but that was the way to get the right value, and it also seems to work on other games. I think it might be something about the 4/3 ratio, but I just saw that it made the height/width come out so the game displays without side borders and isn't stretched vertically at all. I also have keepaspect at 0 and unevenstretch at 0; with those on, the screen will fit any resolution but alter things. That's another odd thing I've seen in the ArcadeVGA site instructions, where they tell you to turn those on; I can't see how that would be a good idea unless the game is just too big and you have to fit it into a smaller resolution.
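For what it's worth, transliterating the second branch above into Python and feeding it a virtua racer-style 496x384 frame reproduces the kind of numbers discussed (517x400 here, near the 520x400 mame reportedly computed). This is my own arithmetic check, not part of switchres:

```python
# Python transliteration of the second Perl branch (game aspect between
# 1.25 and the monitor's 4:3), checked with the 496x384 example from the
# post.
MASPECT = 4.0 / 3.0

def letterbox(width, height):
    gaspect = width / height
    if 1.25 <= gaspect < MASPECT:
        adiff = MASPECT - gaspect
        height = height + adiff * height  # grow height by aspect difference
        width = gaspect * height          # recompute width at game aspect
    return round(width), round(height)

print(letterbox(496, 384))  # -> (517, 400)
```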

Edit:

Yeah, I now see issues with doing this. Hmmm, it seems it may be a side effect of lrmc causing the problem with some games that I'm trying to fix through this. I'll have to look at what it's doing and see.
« Last Edit: October 21, 2010, 02:52:32 pm by bitbytebit »
SwitchRes / GroovyMame - http://arcade.groovy.org
Modeline Generator and Mame Wrapper for Windows or Linux
LiveCD of Groovy Arcade Linux for Arcade Monitors
GroovyMame - generate arcade resolutions like advancemame
--
The Groovy Organization

  
 
