Software Support > GroovyMAME

Switchres: modeline generator engine


bitbytebit:

--- Quote from: Calamity on October 17, 2010, 05:53:53 pm ---
--- Quote from: bitbytebit on October 16, 2010, 10:27:55 pm ---Is using SDL on Windows a bad thing compared to the other options there?  Would think that SDL 1.3 would fix both Linux and Windows, if it also works well in Windows that is.  Possibly just adding some type of fixes to be like triplebuffer or proper vsync.  I also have read that the vsync issues are really an X windows one and they have indicated in a future release they will have some option for vsync with the X server.

--- End quote ---

I'm right now testing Win32 SDLMAME on my laptop; after some tweaking it's running perfectly smooth and vsynced. The right settings in Windows seem to be: video opengl, keepaspect 0, unevenstretch 0, waitvsync 1, throttle 0, switchres 1. If 'video soft' is used, I can't get vsync working. I still have to test it using custom video modes in my cabinet, but it's quite probable it can replicate the DirectX functionality for managing modes. However, there's still the inherent limitation of integer values for refresh rates, imposed by the system's and drivers' internal calls. In Windows, any API we use for going fullscreen will be constrained by that.

From a Windows point of view, the Linux xrandr method seems strange: it looks like resizing the desktop before maximizing the window, affecting all applications. Although we have Win32 APIs to resize the desktop resolution, they are not used in this context; the DirectX API (and SDL too, I suppose) is used instead to switch video modes and take the screen in exclusive mode, so you can access the advanced video features. The method I suggested would just use plain SDL to go fullscreen, i.e. "320x224", but we'd make sure the proper refresh is used by making a unique modeline available to the driver at the 320x224 resolution, so it should not fail... anyway, this is just theory.


--- End quote ---

I have xrandr working great now, doing it the way you're suggesting: dynamically creating the modelines when starting each game, so it knows exactly what it will get. This works great, and after fixing the radeon driver memory leak it's very fast and no longer unstable. It works better than anything I've seen yet, and it's very good at getting the vertical refresh as close as possible for a given monitor. I still need to look into whether SDL 1.3 really fixes vsync when the refresh isn't exact, or what other options there are in Linux SDL. I do see the option "-refreshspeed        automatically adjusts the speed of gameplay to keep the refresh rate lower than the screen", which looks like it might help when games can't get a perfect refresh rate.

I also have an xrandr patch, and it requires the newest git checkout of xrandr, because older versions used integers for the modeline's dotclock value. The newest doesn't, but I modified it to make the value truly exact, as there was still a slight bit of decimal rounding. There's an xrandr.diff patch for that now, and the mame_xrandr.pl script takes the game and monitor type, like "mame_xrandr.pl pacman -m d9800"; if no monitor is given it uses lrmc.conf. It supposedly takes advancemame-type clock lines to get the right blanking times, which is interesting compared to using example modelines. The source seems to have that info in it; I haven't seen it discussed anywhere else, and using it sort of worked for me, but either I had a bad line for it or it's a bit buggy.

I think in general the xrandr method is really the way to go: it avoids messy modelines and does exactly what you had in mind, really forcing the exact resolution. I'm thinking that building this into MAME might be interesting, but then again it might not get us much more than we already have using a Perl script.
You still have to create the .ini files; I think that's best, because they're basically a database anyway, and we're actually letting MAME do the mode switching while just adding modelines with X for MAME to find. The .ini files also carry extra info for games that need keepaspect and unevenstretch, either when they're at the default resolution or when a modeline with a large enough vertical height can't be calculated for them. This version really ought to be interesting because it opens up a lot more possibilities with xrandr, makes things a lot easier, and might eventually let us drop the modified X servers/drivers entirely. (I'm not fully sure they're still needed. They're probably nicer for an arcade monitor, since the modified ones prevent stray odd resolutions from being generated, but then again most of those can be removed just by setting the HorizSync/VertRefresh ranges correctly in xorg.conf. Although with a WG 9800, which has a big range, X goes a bit crazy with the defaults.)
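For anyone curious what the dynamic-modeline workflow described above boils down to, here is a minimal Python sketch that just builds the xrandr command lines for registering a temporary mode, switching to it, and cleaning up afterwards. The mode name, modeline numbers, and output name (`VGA-0`) are all illustrative assumptions, not taken from the actual mame_xrandr.pl script.

```python
def xrandr_commands(name, modeline, output="VGA-0"):
    """Build the xrandr invocations for a temporary custom mode.

    `modeline` is the usual X modeline body: dotclock (MHz) followed by
    horizontal and vertical timings plus sync polarity flags.
    """
    setup = [
        f"xrandr --newmode {name} {modeline}",      # register the modeline with X
        f"xrandr --addmode {output} {name}",        # attach it to the output
        f"xrandr --output {output} --mode {name}",  # switch to it
    ]
    teardown = [
        f"xrandr --output {output} --auto",   # back to the preferred mode
        f"xrandr --delmode {output} {name}",  # detach the mode from the output
        f"xrandr --rmmode {name}",            # drop the modeline from X
    ]
    return setup, teardown


# Illustrative CGA-range modeline (made up, not generated by lrmc):
setup, teardown = xrandr_commands(
    "320x240_60", "6.540 320 336 368 416 240 243 246 262 -hsync -vsync")
for cmd in setup + teardown:
    print(cmd)
```

A wrapper script would run the `setup` commands, launch the emulator, then run `teardown` once it exits, which is essentially what the switchres wrapper does.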

Another thing I can't figure out is how to get good console support. All the framebuffer drivers seem aimed at VESA modes from the BIOS and won't allow fully alterable modelines; I think they actually force the pixel clock to an integer or something, the same as old xrandr versions did. uvesafb seems close to working; radeonfb doesn't seem to detect the newer ArcadeVGA cards, since they have RV600 chips and the Linux fb driver only goes up to the RV400 chips. So my Linux console is always a bit out of whack even for a d9800, and I'm sure a normal arcade monitor would just hate the mode it's in. I'm not sure how the arcade Linux distributions handle this exactly, but it's something I've been trying to figure out too.

So your theory holds up pretty well, going by my xrandr Perl script prototype/working model :). I'm actually quite amazed, because I had been seeing instability with xrandr and the modelines weren't right, but those problems turned out to be the memory leak and the older xrandr versions. Once I fixed those two things, xrandr worked great (plus I figured out how to use the utility properly; the Perl script does all the dirty work of getting xrandr to add/delete modelines for you).

Version .10 is up, and it hopefully works well with xrandr, and even decently with the SDL 1.2 hres fix, although I can see what you mean about the screen getting pushed further and further right. I did something where I recalculate the modeline with the new values without aligning it to 8 pixels. I found a general problem with this method, though: in MAME you basically get a chopped right side, even on vertical games, equal to the difference from 8-pixel alignment. So I guess that makes xrandr even better and effectively necessary; otherwise you would have to shift by 8 pixels per modeline to keep MAME from chopping pixels.
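To make the 8-pixel issue concrete: if the horizontal resolution is forced to an 8-pixel boundary, any mode whose width isn't a multiple of 8 loses the remainder off one edge. A toy Python illustration follows; the rounding-down behaviour is my assumption, chosen to match the "chopped right side" symptom described above, not something verified against MAME's source.

```python
def chopped_pixels(hactive, align=8):
    """Columns lost when hactive is forced down to a multiple of `align`."""
    return hactive % align


def aligned_width(hactive, align=8):
    """The width actually used after alignment."""
    return hactive - chopped_pixels(hactive, align)


# A 292-pixel-wide mode under 8-pixel alignment: 288 usable, 4 chopped.
print(aligned_width(292), chopped_pixels(292))

# A width that is already a multiple of 8 loses nothing.
print(aligned_width(320), chopped_pixels(320))
```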

bitbytebit:
Got confirmation from Wells Gardner about the D9800 horizontal frequency ranges. He says they are not fixed, there's no danger to the monitor, and it can basically do anywhere in the range of 15.25 to 40.00 kHz. Pretty interesting; it seems it isn't like the D9200, which sounds like it has fixed points for the CGA/EGA/VGA areas. That's good, because I've been getting very close to exact vertical refresh rates for most games now by allowing the horizontal frequency to go up to 19 kHz, so the D9800 should be able to natively handle the vsync issues just by setting the modeline up right. I still need to figure out why a few of the modelines that get up closer to 20 kHz are skewed a little to the left, although it's mostly vertical games like pacman where it's not a big issue. I may need an extra display mode in lrmc for the range just above CGA, between it and EGA, with different horizontal blanking timings. I have a set where I changed the horizontal values and it seems to shift the picture back, but I need to separate it out and use it only for the 17-20 kHz range, since when it's used for lower frequencies it can skew them the opposite way.
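The arithmetic behind chasing exact refresh rates inside that 15.25-40 kHz window is just vfreq = hfreq / vtotal. Here is a small Python sketch; the line counts and the 57.55 Hz target are illustrative numbers of my own, not lrmc's actual choices.

```python
D9800_HFREQ_KHZ = (15.25, 40.00)  # range quoted by Wells Gardner above


def vertical_refresh_hz(hfreq_khz, vtotal):
    """Vertical refresh for a given horizontal frequency and total line count."""
    return hfreq_khz * 1000.0 / vtotal


def hfreq_khz_for(target_hz, vtotal):
    """Horizontal frequency needed to hit a target refresh with vtotal lines."""
    return target_hz * vtotal / 1000.0


# A classic 15.72 kHz / 262-total-line mode lands on 60 Hz:
print(vertical_refresh_hz(15.72, 262))

# To hit, say, 57.55 Hz exactly with 270 total lines, we need ~15.54 kHz,
# comfortably inside the D9800's quoted band:
hf = hfreq_khz_for(57.55, 270)
print(hf, D9800_HFREQ_KHZ[0] <= hf <= D9800_HFREQ_KHZ[1])
```

This is why a higher horizontal ceiling helps: more candidate vtotal values fall inside the band, so the generator can pick one whose implied refresh matches the game's original rate almost exactly.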

bitbytebit:
Version 0.11 now makes this a much simpler solution, universal for all graphics cards, and best for ones with low dotclock capability.
* No longer needs any .ini files or modelines in xorg.conf: modelines are generated dynamically and pushed into X with xrandr, then removed when done.
* Can now run any emulator with the dynamic modelines and the switchres wrapper.
* Removed all the Xorg/Radeon stuff; there are just patches for each now, since it's no longer really necessary to install/patch X.
* Big improvements to lrmc's modeline-creation decisions; it's now a good general modeline creator and should be useful for Soft15Khz modeline creation or any other system.
* Much easier to get working immediately: just install the modified lrmc, install the newest xrandr with my patch applied, and wrap MAME (or MESS, or any other emulator) with switchres.
* Should be much easier to eventually get this working in Windows too; we just need to figure out how to add custom modelines dynamically to the drivers and remove them. The lrmc program compiles under Windows, and switchres is Perl, so it will run in Windows. Not working yet, but the goal is eventually a universal dynamic modeline generator across all platforms for emulators to use.
* Can attach the zip file to this message again, because it's now very small without all the extra Xorg stuff.

bitbytebit:

--- Quote from: Calamity on October 10, 2010, 07:56:09 am ---
--- Quote from: bitbytebit on October 09, 2010, 08:13:38 pm ---I've wondered about the games not all having correct information; I can definitely see some where the info just doesn't seem right and it's hard to get them to display nicely. The pixel clock values are there for some, but when I try to use them in my calculations, which I can do in lrmc as a minimum, some games just become too small on the screen.
--- End quote ---

Pixel clock values in mame.xml, as well as porch values, are valuable information in their own right, and also if we intended to replicate the exact original video signal. But I'm afraid this is not the way to go if we want to have hundreds of games simultaneously centered on the monitor. Instead, we must redefine the pixel clock for each video mode based on the porches (borders) we want to have, which should be as constant as possible among different video modes, to avoid having to adjust the monitor all the time (this is only possible with horizontal borders; vertical amplitude must be adjusted manually). At the same time, we have to keep our vertical frequency as close as possible to the original.
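The fixed-geometry approach described above, keeping the total geometry (and hence the border proportions) constant and deriving the pixel clock from the target vertical frequency, comes down to dotclock = htotal × vtotal × vfreq. A sketch in Python with made-up geometry (the 416x262 totals are illustrative, not values from lrmc or mame.xml):

```python
def dotclock_mhz(htotal, vtotal, vfreq_hz):
    """Pixel clock implied by a fixed total geometry and a target refresh."""
    return htotal * vtotal * vfreq_hz / 1e6


# Keep the same 416x262 total geometry (so the borders stay put) and
# re-derive the clock for two games with different original refreshes:
for vfreq in (60.00, 59.18):
    clock = dotclock_mhz(416, 262, vfreq)
    hfreq_khz = clock * 1000.0 / 416  # resulting horizontal frequency
    print(f"{vfreq:.2f} Hz -> {clock:.4f} MHz, {hfreq_khz:.2f} kHz")
```

Because only the clock changes between the two modes, the relative porch widths stay identical, which is exactly what keeps hundreds of games centered without touching the monitor controls.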


--- Quote from: bitbytebit on October 09, 2010, 08:13:38 pm ---It's definitely fun to work on this. I started this arcade cabinet as a project for myself and had to use Linux because that's all I've ever used for anything, which has led me on a mission to share what I'm doing: I basically want to make this stuff work on Linux as well as it does on Windows. Oddly, I'm surprised that we've probably already surpassed Windows in some ways, because of the ability to access the hardware directly through the X server, to have unlimited modelines, and to have no doors permanently closed; we should only be limited by what the hardware can really do and the time it takes to program it.
--- End quote ---

Definitely, that functionality is not available in Windows. Well, I have managed to trick the Catalyst video driver into "resetting" itself on the fly so that it reads the registry modelines without the need to reboot, so I could eventually have unlimited modelines in Windows too, but I still have to figure out how to make this available to different emulators (I believe the only option is a sort of loader), and your Linux method is definitely much cleaner and more straightforward.

My biggest concern with Linux and the X Radeon driver is whether you can really achieve proper vsync functionality. I have heard there were problems with these cards and vsync; I hope that's been solved. Some folks had problems with this on the Spanish forum:

http://www.retrovicio.com/foro/showthread.php?t=8250&highlight=investigaci%C3%B3n

Thanks for the good work!

--- End quote ---

Guessing from reading this, you could probably trick the Radeon driver in Windows into taking the lrmc-generated modeline from my new switchres script, using your method in Windows instead of xrandr in Linux. This sounds quite interesting. Do you think other Windows graphics drivers can be tricked this way (or have they been already), and how exactly would I need to go about adapting switchres to do this when run in Windows? Given the amazing functionality I'm getting in Linux with switchres now, being able to generate pretty good modelines dynamically without relying on .ini files, it would be great to combine it with your ability to push modelines into the Radeon driver in Windows. If this could be done for even more video cards in Windows, then we'd really have something extra interesting. If we can make switchres work exactly the same in both Windows and Linux, that would be great, and at least with the Radeon driver this seems very possible.

elvis:
I'm having trouble patching xrandr with the 0.11 download above.

I do:

# unzip GenRes-0.11.zip
# cd GenRes-0.11
# git clone git://anongit.freedesktop.org/xorg/app/xrandr
# cd xrandr
# git apply --stat ../patches/xrandr.diff
 xrandr.c |   18 ++++++++++--------
 1 files changed, 10 insertions(+), 8 deletions(-)
# git apply --check ../patches/xrandr.diff
error: patch failed: xrandr.c:1426
error: xrandr.c: patch does not apply

Have I done something wrong?
