The NEW Build Your Own Arcade Controls

Software Support => GroovyMAME => Topic started by: philexile on December 21, 2018, 10:57:34 pm

Title: Most Powerful GPU for Groovymame
Post by: philexile on December 21, 2018, 10:57:34 pm
I’m working on a revised GroovyMAME build this holiday and wanted to get a new video card.

I saw that the R9 series is compatible with GroovyMAME and wanted to check what the most powerful card would be that is fully compatible with a CRT monitor.

I plan to build an i9-9900K machine with a dual-boot Windows 10 install. The first Windows 10 install will be for GroovyMAME and the second will be a “normal” Windows 10 install for modern games and Dolphin. This is why I wanted the most modern card possible.

Any advice would be appreciated.
Title: Re: Most Powerful GPU for Groovymame
Post by: Arroyo on December 21, 2018, 11:33:41 pm

A lot of this was discussed here:

 http://forum.arcadecontrols.com/index.php/topic,158837.msg1670778.html#msg1670778 (http://forum.arcadecontrols.com/index.php/topic,158837.msg1670778.html#msg1670778)
Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on December 22, 2018, 04:31:39 am
I'm still wondering what specs really matter for GroovyMAME, though.
Clock speed is one thing, but different architectures tend to privilege one thing over another: for instance, the R9 cards, if I understand correctly, have a rather slow clock but were designed with a large bus and might produce more bandwidth than other cards of the same level.
But other cards with a higher clock and smaller bus (like several RX models) might perform better in other areas.
 :dunno
There are even cards specialized in CAD/CAM that perform considerably better than most high-end consumer ones, but at 2D instead of 3D, or whatever the deal is, I don't know.


EDIT: just an example to illustrate. Here are two clearly different cards: one privileges clock speed, the other pure processing/calculation.
Which one is better for GM? No idea...
(http://i65.tinypic.com/2wod5rc.png)
Title: Re: Most Powerful GPU for Groovymame
Post by: philexile on December 22, 2018, 12:12:19 pm
Thank you for the feedback. I just ordered a 390X.

I don’t think the card will really matter much for GroovyMAME. I want a newer card so that I can use it both with GroovyMAME and with modern games.

I plan to have two separate Windows 10 installs — one with normal drivers/setup for modern games (and Dolphin) and one for the custom Groovymame build.

Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on December 22, 2018, 12:22:30 pm
Aren't you going a bit fast here? The 390X (AFAIK the only one still around is the MSI 390X Gaming 8GB) doesn't seem to feature a DVI-I out, which means no direct analogue output for 15KHz, so you'd have to wait for the next CRT_Emudriver release and use a DVI>VGA converter.

Otherwise it sure is powerful any way you look at it.
Title: Re: Most Powerful GPU for Groovymame
Post by: philexile on December 22, 2018, 01:36:19 pm
Hello, I ordered this Asus model:

https://www.amazon.com/gp/aw/d/B011D7AAEE/ref=ya_aw_od_pi?ie=UTF8&psc=1 (https://www.amazon.com/gp/aw/d/B011D7AAEE/ref=ya_aw_od_pi?ie=UTF8&psc=1)

It has a DVI port so it should be OK, correct?

Thanks
Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on December 22, 2018, 01:50:16 pm
Well, this is peculiar, since it claims DVI-I but the picture shows DVI-D (digital only).

The ASUS website also says it's DVI-D, which is expected when there's only one DVI out. Typically, cards featuring DVI-I have two DVI outs, one D and one I.

It's not that you won't be able to use it for 15KHz if there's no DVI-I; Calamity has been preparing the future for this, but it's not out yet and will definitely require a VGA adapter, with results that will only be known over time through community testing and feedback.

EDIT: just to be clear:
(http://i63.tinypic.com/2remgyv.png)
Title: Re: Most Powerful GPU for Groovymame
Post by: philexile on December 22, 2018, 01:56:42 pm
Ah, OK. Well, if you were buying today, what would you get? I could do eBay, but I’d prefer to get something new.

I have time over the holiday, which is why I’m anxious to order now.

Thanks again
Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on December 22, 2018, 02:02:17 pm
The most powerful I can find with a DVI-I analogue out (check the pic above).

But as I said, going for one with an analogue out is not compulsory; it's just that the 'new way' using digital + converter hasn't gone through the testing phase yet.

Calamity has a Vega 56, so he must be confident that we can live without an analogue out.

EDIT: for the R9 series it seems the 380X is the highest-end model with some variants featuring DVI-I, but please take your time searching and comparing.
Title: Re: Most Powerful GPU for Groovymame
Post by: philexile on December 22, 2018, 04:07:28 pm
Thanks for the info. The odd thing about the 390X card that I linked to is that it also has DVI-I marked in the metal underneath the (typically) DVI-D port.

Is there any way for analog to work with just the single pin rather than the standard 5?

Thanks again

EDIT: I assume this is a mistake, but it’s just odd.
Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on December 22, 2018, 05:26:03 pm
Thanks for the info. The odd thing about the 390X card that I linked to is that it also has DVI-I marked in the metal underneath the (typically) DVI-D port.
Yes, that's really odd; I don't know the reason. Look for that model on the ASUS site and you'll see the detailed specs say DVI-D.
Beware of online stores' descriptions; when you look for specs, always check the manufacturer's website with the full part number, here STRIX-R9390X-DC3OC-8GD5-GAMING
https://www.asus.com/us/Graphics-Cards/STRIXR9390XDC3OC8GD5GAMING/specifications/ (https://www.asus.com/us/Graphics-Cards/STRIXR9390XDC3OC8GD5GAMING/specifications/)

Is there any way for analog to work with just the single pin rather than the standard 5?
With a VGA adapter, apparently yes; ask Calamity for the details, it's too technical for me to explain.

I understand you'd prefer the 390X as it is clearly more powerful, but there are even more powerful AMD cards available, so it might be worth waiting for the upcoming CRT_Emudriver version and learning more from Calamity, who will probably advise on VGA adapters as well.
You seem to be in a hurry, but when it comes to building a dedicated PC with Groovy/Emudriver involved, you have to think twice before clicking the 'buy' button.
Title: Re: Most Powerful GPU for Groovymame
Post by: philexile on December 22, 2018, 07:00:14 pm
Thanks for the advice!

I think I may have found a good 380 which definitely has analog. I may end up upgrading to a more powerful GPU depending on what solutions are found. I don’t want to wait, because this holiday period is the time I have to get this project launched. :)

I’m using the PC exclusively with a CRT. The “normal” PC will likely run at 720p, though I may try 1080i to see how that looks. With those specs in mind, the 380 may be enough for my purposes. I’m most interested in running Dolphin and a few older PC games like Stalker.

Title: Re: Most Powerful GPU for Groovymame
Post by: keilmillerjr on December 22, 2018, 10:41:12 pm
I don’t understand the point of dual booting. Use crt emudrivers with a crt and play games. :dunno
Title: Re: Most Powerful GPU for Groovymame
Post by: philexile on December 22, 2018, 11:26:12 pm
Well, I had issues with 720p being an “option” for CRT_Emudriver in the past. GroovyMAME would sometimes use that resolution in error.

I’d also want to have the current drivers for modern games and Dolphin.

If there is an easier way to do this I’m all ears! :)
Title: Re: Most Powerful GPU for Groovymame
Post by: keilmillerjr on December 22, 2018, 11:49:31 pm
A really old card will work fine for Dolphin. I use an HD 5450 and had mixed results with newer Steam games. King of Fighters XII runs fine on good graphics settings and looks/plays amazing. Street Fighter X Tekken has to be on the lowest graphics settings. Any newer card, I would imagine, would be fine for non-arcade but arcade-worthy titles. If you're looking to play some crazy game, you probably want a PlayStation or Xbox.
Title: Re: Most Powerful GPU for Groovymame
Post by: philexile on December 22, 2018, 11:59:39 pm
What kind of monitor are you using though? Is it a CRT?

I have two: an NEC XP 29” that I’ll mostly be using Groovy with, and a Sony BVM D32 that I’d want to use for Dolphin and some modern games, nothing crazy. I’d also use it with some arcade games.

I don’t want Groovy using the 720p resolution, which it has in the past. This is why I want separate installs. Maybe I did something wrong, though?

Also, if I remember correctly, the drivers aren’t the most current, and that caused issues with some Steam games.
Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on December 23, 2018, 04:24:41 am
OK, when you wrote 'modern games and Dolphin' I thought your second display would be a modern flat panel with a much higher resolution, which you would use for today's PC games and heavy console emulation.

But this is not the case, so for 720p on a CRT, even if it's for Dolphin or the like, you definitely won't need a $300 GPU lol.

Still, for a PC with such a monster CPU (why the i9-9900K, by the way?) it would be a shame to pair it with a weak-ass old card.
Title: Re: Most Powerful GPU for Groovymame
Post by: philexile on December 23, 2018, 07:22:23 am
Hello,

The reason I want to use the i9 is to reduce input and audio lag as much as possible. I want to run Frame Delay as high as possible (9, I think) and have low audio latency with PortAudio (or ASIO). Both of those require a strong CPU.
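For reference, this is roughly what that combination looks like in a GroovyMAME mame.ini. A minimal sketch with illustrative values, not a recommendation; option names may vary between builds, so check your build's -showusage:

```ini
# mame.ini (excerpt) - illustrative values only
# sync video updates to the emulated refresh rate
syncrefresh       1
# frame delay 0-9; higher means less input lag but needs CPU headroom
frame_delay       9
# low-latency PortAudio backend, with a small target latency in seconds
sound             portaudio
pa_latency        0.003
```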

Does that make sense now?

Talk soon
Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on December 23, 2018, 07:40:38 am
Ah yes, of course a faster CPU and more cores are always better (although as Calamity said, the latter is theory and applies differently for different drivers).
Title: Re: Most Powerful GPU for Groovymame
Post by: philexile on December 23, 2018, 09:31:18 am
I’m mainly getting it because of the single-core performance, but I have read that multi-core is beginning to be important for MAME.

I want to have this setup for quite a long time so the investment will be worth it for me. :)

Title: Re: Most Powerful GPU for Groovymame
Post by: keilmillerjr on December 23, 2018, 09:42:05 am
OK when you wrote 'modern games and dolphin' I thought your second display would be a modern flat panel with a much higher resolution which you would use for today's PC games and heavy consoles emulation.

But this is not the case, so for 720p on a CRT even if it's for Dolphin or the likes you definitely won't need a $300 gpu lol.

Still so for a PC with such a monster CPU (why the i9-9900K by the way?) it would be a shame to pair it with a weak-ass old card.

This is what I thought as well.

I use a Wells Gardner K7000. For 720p games, I use a 360 modeline.
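For anyone unfamiliar with modelines, here's the general shape of one. This is a common 15 kHz 320x240@60 example, not keilmillerjr's actual 360-line mode; the numbers are illustrative, so generate your own with switchres (or VMMaker) rather than copying these:

```
# label          pclk(MHz) hdisp hsync-start hsync-end htotal  vdisp vsync-start vsync-end vtotal
Modeline "320x240_60" 6.70  320   336         367       426     240   244         247       262  -hsync -vsync
# horizontal freq = 6.70 MHz / 426 ~= 15.7 kHz; vertical = 15.7 kHz / 262 ~= 60 Hz
```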
Title: Re: Most Powerful GPU for Groovymame
Post by: philexile on December 23, 2018, 11:55:05 am
Maybe I’ll try that then. I would much prefer a single Windows 10 install.

Are you able to play games through Steam without any issues?

The problem I would run into with GM was that I could not get GM to ignore the 720p resolutions.

Thanks
Title: Re: Most Powerful GPU for Groovymame
Post by: keilmillerjr on December 23, 2018, 04:26:43 pm
Steam is like a store; games can be launched independently. I’ve only tried a few games. I've been all set if it’s a game with keyboard support and a max of 4 buttons. Each game needs to be looked at individually.
Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on December 24, 2018, 04:31:22 am
I want to have this setup for quite a long time so the investment will be worth it for me. :)

CPU development seems to have hit a wall in recent years, so unless Intel/AMD discover some new technology there won't be a leap forward. I mean this processor could last you almost a decade before it's really obsolete.
Title: Re: Most Powerful GPU for Groovymame
Post by: philexile on December 24, 2018, 08:53:11 am
Fantastic, that would be perfect for me.

Also, thanks to everyone for the advice!
Title: Re: Most Powerful GPU for Groovymame
Post by: Torkyo on January 17, 2019, 12:20:57 pm
Hello,
nice thread.

As far as I know, the frame delay option of GroovyMAME is strictly related to the single-core performance of the CPU. So even though a multi-core chip like the i9-9900K (CPU Mark 20174) is three times faster than an i5-3470 @ 3.20GHz (CPU Mark 6705), a CPU dating back to Q2 2012, if we compare their Single Thread Rating the i9-9900K is only about 33% faster than the i5-3470. So I was wondering whether a beast of a CPU like philexile's can give other advantages on GroovyMAME ground, and if so, in which ways.
thanks
thanks
Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on January 17, 2019, 02:39:36 pm
Quickly, and some will correct me if I'm wrong:

- Depending on the game, MAME uses up to 8 cores and more if necessary for emulation, not just a single one. How multi-core affects frame_delay performance, though? Dunno for sure. Maybe it does indirectly, in terms of timing, since other threads like the inputs might be taken care of elsewhere (Calamity mentioned something like that).

- AFAIK GroovyMAME relies on both the CPU and GPU; the latter does matter. How much, though? I don't know for sure either, but I've been suspecting mine is what's keeping me from using higher frame_delay values, like a stable 8 or 9, for a number of games.
I suspect so because when I OC'd my i5-4690K to 4.2GHz it allowed me to run more games at 100%, but not really to push frame_delay higher, as far as I could tell.
(Anyway, I also suspect there are other things getting in the way; the highest frame_delay settings might not always be fit or desirable for some reason.)

- 33% more STP is a lot, and don't forget CPUs like that i9 can boost/overclock past 5GHz! (PassMark doesn't quote OC scores.) I think, though, that for most games you can probably reach the best performance with lesser CPUs (for instance an overclocked i3-8350K); the 9900K will probably have the edge with the heaviest and most thread-heavy hardware you can emulate with MAME.

Title: Re: Most Powerful GPU for Groovymame
Post by: zerochad on January 17, 2019, 05:50:41 pm
On my AstroCity conversion, I'm using an i7 with 8GB RAM and an HD 7570, and I can play GameCube (Dolphin via RetroArch) and PS2 games at their native resolutions on the CRT. Full speed as well; you've just got to be smart about what settings you use in each emulator. Since the resolution is 640x480, it doesn't need too much grunt from the GPU. I think most of it is on the CPU. If you're upscaling to 720p then, yeah, you might need something a bit beefier on the graphics side.
Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on January 17, 2019, 07:01:48 pm
His question was about frame_delay specifically, not about getting full speed in games or other emulators. Also, sure, we should have mentioned that requirements might differ depending on whether a CRT or a higher-resolution display (specifically a flat panel) is used in the setup...
Title: Re: Most Powerful GPU for Groovymame
Post by: Paradroid on January 17, 2019, 07:40:35 pm
AFAIK GroovyMAME relies on both the CPU and GPU; the latter does matter. How much, though? I don't know for sure either, but I've been suspecting mine is what's keeping me from using higher frame_delay values, like a stable 8 or 9, for a number of games.

My understanding is that the GPU only became an issue once everyone started using the super resolution and frame delay combination: 2560-wide resolutions require the GPU to scale from native widths (e.g. 320, 384, etc.) and, with frame delay, at even faster speeds (since that feature creates a "just in time" scenario).

If you don't use super resolutions, you won't need such a powerful GPU.
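To make the scaling point concrete: a super resolution keeps the native vertical timings and multiplies the horizontal ones, so common native widths divide evenly into 2560 and the GPU only has to stretch horizontally. The numbers below are illustrative only (real ones come from switchres); the second modeline is simply the first with all horizontal values and the pixel clock multiplied by 8:

```
# Native 15 kHz mode (what the game outputs vertically)
Modeline "320x240_60"   6.70  320  336  367  426  240 244 247 262 -hsync -vsync
# Super resolution: same vertical timings, 8x pixel clock and horizontal values
Modeline "2560x240_60" 53.60 2560 2688 2936 3408  240 244 247 262 -hsync -vsync
```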
Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on January 17, 2019, 09:50:39 pm
Well, lots of people use super resolutions, it seems.

In my configuration I'm not using super resolutions but a fixed 1920x1080 display, and it seems even 'just' that might require more than my R7 260X if I want to reach frame_delay 8 or 9.

Maybe the 370 at the top of the R7 series, being a bit superior, would do, but there's little price difference in the realm of used GPUs at that level, so I guess I could grab a used RX 480 or 580 and, with what's easily over twice the overall power, finally max out frame_delay where I want/where possible.
(The R9 380/X is more versatile since it also supports analogue, if I ever want to use it for a CRT setup later, but the price and availability aren't too good around here.)

Now with a WQHD or 4K display, I wonder if even those mid-range cards would be enough...
Title: Re: Most Powerful GPU for Groovymame
Post by: Torkyo on January 18, 2019, 09:18:35 am
Thanks for your answers!

For my part, I have an i5-3470 and basically, with most games, I can stick to FD 8 without any slowdown, with pixel-perfect shape and game speed. With the very most of the golden-age games I can stay on FD 9 without slowdowns.
As said already in a different thread, with games like Shienryu I can't even set FD to 1, as it immediately has huge slowdowns (CPU down to 50-60%); this kind of game is very CPU-demanding, and I'm wondering if philexile's rig can handle it with frame delay on.
Please note that with FD set to 0, Shienryu is perfect: input lag of 1 or 2 frames, but perfect speed and no slowdown in any condition.
I do think that single-core power is the key factor here; otherwise we would see a huge difference between the i5-3470 and the i9-9900, the latter being 300% faster than the i5 in question.
To be precise, I'm using a CRT and super resolutions on both horizontal and vertical games. It should be said that using super resolutions for vertical games (as explained by Calamity) requires a GPU capable of good bandwidth, otherwise you will suffer tearing (I want to underline that we are talking about tearing, not slowdown: the speed sticks to 100%, but there's a stable tear at the top of the screen). To use super resolutions with vertical games with more than 288 lines, a 4350 is not strong enough and will suffer tearing; the only way there is to use standard resolutions, as was explained to me (it worked).
Anyway, I upgraded my system with a 7850, which has about double the 2D speed of the 4350, and I can use super resolutions without tearing, even with vertical games.
I must say that with Shienryu, no matter whether I use standard or super resolutions, and no matter whether I use the 4350 or the 7850, the result with frame delay is precisely the same in both cases: if I set FD to just 1, I suffer slowdowns. So I think I can safely say that the GPU has no effect on the frame delay matter; it is just about the CPU, and most likely about the sheer speed of a single core.
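As a rough illustration of why wide super resolutions demand more from the GPU, here's a back-of-the-envelope comparison of raw framebuffer scanout bandwidth. This is my own simplification, not a real benchmark; actual load also includes the scaling work itself:

```python
def scanout_bandwidth_mb_s(width, height, refresh_hz, bytes_per_pixel=4):
    """Raw framebuffer scanout bandwidth in MB/s (width x height x refresh x bpp)."""
    return width * height * refresh_hz * bytes_per_pixel / 1e6

# Native 320x240 vs. a 2560-wide super resolution, both at 60 Hz:
native = scanout_bandwidth_mb_s(320, 240, 60)       # ~18 MB/s
superres = scanout_bandwidth_mb_s(2560, 240, 60)    # ~147 MB/s
print(f"native: {native:.1f} MB/s, super: {superres:.1f} MB/s")
```

Eight times the pixels per line means eight times the raw bandwidth for the same vertical timing, which hints at why weaker cards start tearing first on super resolutions.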
Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on January 18, 2019, 10:08:06 am
The setups are different and I'm betting the requirements are too; we shouldn't compare like this. My setup being overall more powerful than yours, both in CPU and GPU power, but running an LCD, is proof enough.

I also maintain that there isn't a single main thread for emulation, and that's probably the thing, especially for hardware like STV; that might explain why frame_delay can't work with some games. I can't picture how the CPU<>GPU frame_delay mechanics work here.
When you say 'single core' I bet you mean 'single thread' performance, but these don't have the same meaning. You may be talking about STP and speed, which goes without saying is important, but the number of threads/cores used by the emulator and the role of the GPU can't be ignored. Since it's not explained, my guess is that depending on the setup, configuration, and games, there are times where GPU power matters more than we probably realize, and times where CPU power matters less than we imagine (yes, I'm still talking precisely about frame_delay; just a general reminder so that we don't deviate).
All of this is only speculation, but heh. ^^
We just never received a detailed explanation, probably because it's too hard to understand for most people, but maybe if we could get the essentials, that would help us choose our PC hardware more according to our needs. It's also possible that there isn't a relatable or general explanation to give yet, and that it's somehow through experience and a kind of benchmarking that we'll see a pattern appear.
Unfortunately, even baseline MAME, which is more straightforward than Groovy, doesn't receive nearly enough feedback on the topic of performance and requirements, so here, being quite niche, it'll be even tougher and slower, I think.

In regards to Shienryu: again, it's STV; what applies to the majority of the hardware emulated by MAME probably doesn't apply to systems so complex, and therefore, as I've just theorized, maybe it doesn't in regards to Groovy either, since it's a fairly different build.
Testing STV games to see if frame_delay works at all, and at what values, was part of what I planned to ask philexile not long ago (then I got distracted/busy with things): Shienryu, VF2, Cotton Boomerang cutscenes, Radiant Silvergun, for instance.
Calamity should be done with his new rig soon (maybe?), so I bet he'll be able to tell us what results he gets from that Vega (dunno which CPU he got).
Title: Re: Most Powerful GPU for Groovymame
Post by: Torkyo on January 18, 2019, 07:26:38 pm
Thanks schmerzkaufen, I get what you mean.
Sorry for the inaccuracy of my wording, but I'm definitely not an expert in this matter (and in many others!). Anyway, yes, I meant single thread, not single core.
I understand that, generally speaking, it is impossible to pin down the respective contributions of CPU and GPU; anyway, I think the faster the better; the point here is to understand what is really faster.
My point is just to understand, for me and for my pockets, whether it is worth spending a lot of money on a brand new CPU and mobo to have the lowest input delay possible from the software side.
The point is to understand whether in GroovyMAME this upgrade (for example from the i5-3470 to the i9-9900) gives an advantage of 300% or 30%. In one case I would do the upgrade, in the other I wouldn't.
Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on January 19, 2019, 04:58:35 am
Lol, I doubt I know much more than you do; I've just been around a bit longer, and unlike a good portion of this community's members I don't possess even a tiny fraction of the knowledge that would give me hope of one day actually understanding how all this stuff works. I will only ever see the tip of the iceberg.  ;D

Still, I persist for two reasons:
- GroovyMAME, compared to official MAME and other alt builds, can indeed provide a more accurate experience of the games; it's not pretend, and I've witnessed here that its advantages are not limited to CRT setups.
- Arcade PCBs, like most retro games, have become collector's items worth hundreds or thousands, even tens of thousands, so if there's a way to play with emulation that's realistically closer to the real thing, it is worth using and supporting, because IMO it completes, and gives more meaning and sense to, the flawed/incomplete definition of preservation that official MAME stands by.

So what are a few hundred, what's even about a thousand bucks and a half, if we can have this now? I don't understand the guys with a Pi, or those who buy the cheapest, weakest old hardware they can find, yet later complain about the limitations. As with PC gaming, better performance in emulation costs money; I accepted that a long time ago.
Also, I'm not going to wait 10 or 20 years until computer hardware that can do the job costs only a fraction of today's price.
Nor am I going to wait another 10 or 20 years for MAME to fix already decades-old issues when acceptable workarounds exist.
(Plus, honestly, I'm getting too old for this ---steaming pile of meadow muffin---; I give myself another decade of caring about it, at best.)

Sure, it would be nice if we knew more, so we could build rigs with more precision and weigh the value for money of each part, but you can't go wrong with more power anyway, and it's still considerably cheaper than buying arcade boards. But I'm being redundant.

The i9-9900K might be too much for a lot of users, but as I said to Philexile, it's the kind of purchase you also make if you intend to keep a rig relevant for many, many years, and this CPU will likely still be considered strong even in 10 years.
Comparatively, I'd say my i5-4690K, which is 5 years old, is still technically relevant, but despite its overclocking headroom it has entered the obsolescence portion of the curve (the same price I paid 5 years ago buys more performance/$ today; the i3-8350K beats it).

But the topic here is GPUs, and frame_delay, which we're having a harder time rationalizing...
Title: Re: Most Powerful GPU for Groovymame
Post by: cools on January 23, 2019, 10:59:37 am
I believe frame_delay will become irrelevant once frame_slice becomes a thing across the board?
Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on January 24, 2019, 03:42:24 am
IIRC it works only for a handful of drivers; you'd need to rewrite most MAME drivers to be line-accurate if you wanted frame_slice to become a common thing.
Needless to say, it won't happen; mamedev just won't redo a massive part of the work they've done over 20 years, so frame_delay is still very relevant because it works with most drivers.
And in its own fashion run-ahead is too; both methods are almost universal but demanding on the hardware.
I think Calamity said he suspects frame_slice might still help to some extent though, even if not fully supported. Dunno the status of his R&D...
Title: Re: Most Powerful GPU for Groovymame
Post by: Calamity on January 24, 2019, 04:03:04 am
IIRC it works only for a handful of drivers; you'd need to rewrite most MAME drivers to be line-accurate if you wanted frame_slice to become a common thing.
Needless to say, it won't happen; mamedev just won't redo a massive part of the work they've done over 20 years, so frame_delay is still very relevant because it works with most drivers.
And in its own fashion run-ahead is too; both methods are almost universal but demanding on the hardware.
I think Calamity said he suspects frame_slice might still help to some extent though, even if not fully supported. Dunno the status of his R&D...

From MAMEdev's point of view, frame slice is the right, "acceptable" way of doing it. In other words, fixing drivers to play nice with frame slice will mean improving the accuracy of those drivers. It doesn't mean redoing all the work, but it's definitely a daunting task.

Frame slice does everything frame delay does and more. So yes, once frame slice is a reality, frame delay will be irrelevant. It is superior to run-ahead too (which is a wrong approach to the problem), although frame slice can't compensate for the latency of crappy monitors the way run-ahead does.

However, as you say, frame delay currently has the advantage of working with all existing drivers, and there's room for improvement once some new developments take place, e.g. making it independent of syncrefresh so it can be used on FreeSync monitors too.

Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on January 24, 2019, 06:12:59 am
From MAMEdev's point of view, frame slice is the right, "acceptable" way of doing it. In other words, fixing drivers to play nice with frame slice will mean improving the accuracy of those drivers. It doesn't mean redoing all the work, but it's definitely a daunting task.
Still, I don't believe they will help at all... Don't tell me you're planning to do it all by yourself in the future!? You will make people worry about your health.  :laugh:

... run-ahead too (which is a wrong approach to the problem), although frame slice can't compensate for the latency of crappy monitors the way run-ahead does.
About run-ahead, the monitor question rarely comes up (most people's understanding of display lag is rather confused anyway), and there's the same lack of interest in accuracy, because people don't think the way devs do. Rather, the main offense is that run-ahead allows playing at a latency lower than the game's original; as run-ahead is perceived by the public, that's the sticking point opposing honest players to idiots. Had run-ahead featured a built-in safeguard, or at least a non-removable warning triggered by overkill settings, the critics wouldn't weigh much, because honestly, as a method for reducing unwanted lag, it is otherwise no worse than frame_delay at the user's end. I mean, in the context of RA as a multi-emu frontend, it is a working, quite universal solution.
Rather, what went wrong is that it was brought in by the wrong people and, as a consequence, was instantly misused.

However, as you say, frame delay currently has the advantage of working with all existing drivers, and there's room for improvement once some new developments take place, e.g. making it independent of syncrefresh so it can be used on FreeSync monitors too.
Always looking forward to future developments, of course. frame_delay is my favourite because you fight to make it work combined with all the other good features (PortAudio, Emudriver, HLSL, sliders, etc.), so it doesn't lie there as a separate gimmick but as a fully integrated, powerful feature, specific to MAME and kept up to date with it. That's why I have no issue spending money on decent PC hardware to make it all work.  :P
Title: Re: Most Powerful GPU for Groovymame
Post by: cools on January 24, 2019, 06:42:07 am
I mean no disrespect, but I think your view of MAMEdev is being clouded by your acknowledged lack of technical know-how and frustration that things you feel are critically important are prioritised differently by the developers.

They've rewritten most of the code base multiple times when a new technology has been adopted, or simply to adopt better development practices. There's constant work on improving the way it all hangs together, which is all visible from reading the GitHub commit history.

Adopting frame_slice is exactly the kind of thing I'd expect them to undertake, and probably sooner than you'd expect.
Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on January 24, 2019, 07:37:08 am
What you say implies some misunderstandings and a lack of some relevant info, but please, if you've read me in other threads where this has derailed already, really, really don't trigger me on the mamedev topic.

****

Anyway, let's continue this thread: has philexile, with his i9-9900K + R9 380, tried some challenging games like the STV titles since then?  :P
Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on January 27, 2019, 04:38:33 am
So since there isn't much enthusiasm :P for investigating GPU performance VS. frame_delay, especially in regards to LCD use in this case, I've ordered a couple more used AMD's and will attempt a comparison myself if the purchases went well (bad used cards aren't too rare unfortunately and eBay has become increasingly shady no matter the good seller ratings)

Although they're not massively far apart, I got an R7 370 and an R9 380X, which I'll put against the R7 260X and GTX 750 Ti I already own, over 1920x1080 and 1600x1200 displays.
I don't know if there's much difference in bandwidth requirements between those two resolutions; too bad I don't own a WQHD display yet, but I'll probably get one in the future after reselling at least two of these cards.
Don't count on me for 4K and more powerful cards though; if you want to know those thresholds you'll have to ask owners of 4K displays and high-end cards.

Waiting on those cards and GM 0.206 to start, then, because I want to test with HLSL on too.
Title: Re: Most Powerful GPU for Groovymame
Post by: Torkyo on January 28, 2019, 10:10:47 am
Quote from: schmerzkaufen
Lol, I doubt I know much more than you do; I've just been around a bit longer, and unlike a good portion of this community's members I don't possess even a tiny fraction of the knowledge that would give me hope of one day actually understanding how all this stuff works. I will only ever see the tip of the iceberg.  ;D

Still I persist because of two reasons:
- Compared to official MAME and other alt builds, GroovyMAME can indeed provide a more accurate experience of the games; it's not pretend, and I've witnessed here that its advantages are not limited to CRT setups.
- Arcade PCBs, like most retro games, have become collector's items worth hundreds or thousands, even tens of thousands, so if there's a way to play via emulation that's realistically closer to the real thing, it's worth using and supporting, because IMO it completes, and gives more meaning to, the flawed/incomplete definition of preservation that official MAME stands by.

So what are a few hundred, or even a thousand and a half bucks, if we can have this now? I don't understand the guys with a Pi, or those who buy the cheapest, weakest old hardware they can find, yet later complain about the limitations. As with PC gaming, better performance in emulation costs money; I accepted that a long time ago.
Also, I'm not going to wait 10 or 20 years until computer hardware that can do the job costs only a fraction of today's price.
Nor am I waiting another 10 or 20 years for MAME to fix decades-old issues when acceptable workarounds exist.
(Plus, honestly, I'm getting too old for this ---steaming pile of meadow muffin---; I give myself another decade of caring about it, at best.)

Sure, it would be nice if we knew more, so we could build rigs with more precision and weigh the value for money of each part, but you can't go wrong with more power anyway, and it's still considerably cheaper than buying arcade boards, but I'm being redundant.

The i9-9900K might be too much for a lot of users, but as I said, Philexile, it's also the kind of purchase you make if you intend to keep a rig relevant for many, many years, and this CPU will likely still be considered strong even in 10 years.
By comparison, my five-year-old i5-4690K is still technically relevant, but despite its overclocking headroom it has entered the obsolescence portion of the curve (the same price I paid five years ago buys more performance per dollar today; the i3-8350K beats it).

But the topic here is GPUs, and frame_delay, which we're having a harder time rationalizing...

Sorry schmerzkaufen! I've been bloody busy and couldn't reply as fast as I wanted! In the meantime I've seen the thread move on (very good reading, btw), and maybe, as you rightly said, we're going a bit off topic here: I'm waiting to hear something from Philexile about the STV titles!
Anyway, thanks, I got your points.
Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on January 28, 2019, 02:14:06 pm
Quote from: Torkyo on January 28, 2019, 10:10:47 am
Sorry schmerzkaufen! I've been bloody busy and couldn't reply as fast as I wanted!
Don't be sorry, it's OK; these are slow niche forums, we can leave a week or two between replies and no one will complain (AFAIK no one in the community is showing signs of early Alzheimer's yet, so that pace is fine)  ;D
You know, I had planned to test a pile of things with GM and various hardware; I have a todo list (yes) I wanted to attack by the end of last November, and well, it's almost February. :p
The other factor is that Calamity makes GroovyMAME progress quite fast, and whenever I think "OK, let's do this" he's already a step above, with more improvements and fixes.
Title: Re: Most Powerful GPU for Groovymame
Post by: Zebra on January 30, 2019, 06:08:50 pm
I think virtual machines are the way to go if you want to run multiple emulators on one PC.

My experience has been that MAME does not make proper use of the GPU yet (although I read they're working on it), but plenty of other emulators do.

GroovyMAME works great with an old $10 eBay card like a 5450 if you have a decent CPU, but Naomi emulators barely run without something better. Even Model 2 and Model 3 stutter on my machine with the 5450.

Then there are the slightly newer PC-based arcade games. Some of these can now be played with VirtualBox images, but they still need a decent GPU.

So... investing in one decent desktop, with one image to run GM on a CRT with a low-spec card and a second image to run everything else with a newer high-spec card, seems optimal. Or... run two machines, if space and money are no object.
Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on January 30, 2019, 06:59:38 pm
Well (for the nth time! ;D) the topic is not MAME's performance with GPUs, but GroovyMAME's frame_delay performance with GPUs, under various configurations.

Unlike baseline MAME, which uses the GPU only for video output and shaders, Groovy needs some of its actual processing power for lag reduction. And with higher-resolution displays the requirements are apparently many steps above those of low-res CRTs; I'll focus on the former.

Anyway, tomorrow I'll go fetch that R9 380X; fingers crossed it works fine (I didn't pay much for it, and I have trust issues when it comes to used PC hardware, particularly from eBay). Then I can begin my little benchmarking adventure; maybe with a chart it'll become clearer to readers what this is about.

It'll take some time tho...

EDIT: damn, Murphy's Law really is powerful these days, especially when buying off eBay:
- I received both cards: a 380X and... a 380 instead of the 370 I ordered. That 380 is a 'small' one, comparable to a boosted 370, but the power requirements are still those of a 380, naturally too close to the 380X's, so it's rather redundant.
- Both require 2x 6-pin PCIe power, and I didn't realize I'm short one cable (of course none was included in the boxes).
:/
I'll pick whichever works best and resell the other quickly. I hate eBay, but it's really the last remaining place to find a broad variety of old hardware; our local Amazon doesn't stock enough used and refurbished goods anymore (or not that old).
I'll limit my benchmarking to three cards then (750 Ti, 260X, and one of the 380s), which should be enough anyway.
Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on February 08, 2019, 08:51:28 am
First experience today with an RX 570 (CRT_Emudriver 2.0 beta 15, Win 7 64) on a 1080p LCD.

Coming from an R7 260X, at first sight (I've only quickly tried a couple of games) the 570 doesn't seem to make a significant difference in how high I can set frame_delay; games seem to have a maximum frame_delay threshold, period.

Rather, it's when I use vsync_offset along with HLSL that something's changed a bit: I either don't see the tearing line, or it's less pronounced.
In practice it seems that in some cases this will let me maintain the same frame_delay level whether HLSL is on or off (whereas with the 260X I always have to drop frame_delay one step to keep a stable HLSL'd picture).

This is only a preliminary observation though; no rushed conclusions (to be continued...)
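For readers following along, the combination being tested here lives in mame.ini. A sketch of the relevant GroovyMAME/MAME settings; the values are purely illustrative, not recommendations:

```ini
# mame.ini fragment (illustrative values, not recommendations)
frame_delay       7     ; GroovyMAME: 0-9; higher = less input lag, tighter time budget
vsync_offset      0     ; GroovyMAME: raise only if a static tearing line appears
hlsl_enable       1     ; stock MAME: CRT-style post-processing (Direct3D only)
```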
Title: Re: Most Powerful GPU for Groovymame
Post by: Calamity on February 08, 2019, 10:07:16 am
Quote from: schmerzkaufen on February 08, 2019, 08:51:28 am
First experience today with an RX 570 (CRT_Emudriver 2.0 beta 15, Win 7 64) on a 1080p LCD.

Coming from an R7 260X, at first sight (I've only quickly tried a couple of games) the 570 doesn't seem to make a significant difference in how high I can set frame_delay; games seem to have a maximum frame_delay threshold, period.

Rather, it's when I use vsync_offset along with HLSL that something's changed a bit: I either don't see the tearing line, or it's less pronounced.
In practice it seems that in some cases this will let me maintain the same frame_delay level whether HLSL is on or off (whereas with the 260X I always have to drop frame_delay one step to keep a stable HLSL'd picture).

This is only a preliminary observation though; no rushed conclusions (to be continued...)

Yes, that makes total sense.

Frame delay is a CPU hog for the most part (it's not that it uses all of your CPU; it's that it requires a very fast CPU in order to keep frame emulation time as short as possible).

However, once you start using higher resolutions, the rendering process itself starts having more impact. And once the rendering time becomes longer than the vertical retrace time, you start seeing static tearing. You compensate for that with vsync offset, but vsync offset can be seen as the opposite of frame delay: raising vsync offset is like lowering frame delay (it's a bit more complicated, but you get the idea). So the faster your video card is, the less need for vsync offset and the more effective frame delay will be.
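Calamity's budget argument can be sketched with back-of-envelope arithmetic. A minimal illustration, assuming the documented GroovyMAME behaviour that frame_delay N (0-9) postpones emulation start by N tenths of the frame period (the function name is mine, just for illustration):

```python
# Time left to emulate + render a frame before the next vblank, for a
# given frame_delay level. frame_delay N delays emulation start by N/10
# of the frame period, cutting input lag but shrinking the work budget.

def frame_budget_ms(refresh_hz: float, frame_delay: int) -> float:
    """Milliseconds available for emulation + rendering per frame."""
    if not 0 <= frame_delay <= 9:
        raise ValueError("frame_delay must be 0-9")
    period_ms = 1000.0 / refresh_hz
    return period_ms * (10 - frame_delay) / 10

for fd in (0, 5, 7, 9):
    print(f"frame_delay {fd}: {frame_budget_ms(60.0, fd):.2f} ms budget at 60 Hz")
```

At 60 Hz the whole frame is ~16.67 ms; frame_delay 7 leaves only ~5 ms, which is why both a slow CPU (longer emulation) and a slow GPU at high resolution (longer rendering) eat into the same shrinking window.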
Title: Re: Most Powerful GPU for Groovymame
Post by: schmerzkaufen on February 08, 2019, 10:33:08 am
I think I understand most of what you said: the benefit of a more powerful card here is not directly in the highest frame_delay level you can achieve with a game, but rather in how much display resolution you can achieve it at, even with post-processing adding its weight on top, without the drawbacks (tearing and vsync_offset).

Too bad I don't have a higher-resolution display right now to make my testing more meaningful.
Well... I have a 1600x1200 monitor in storage; would that make much of a perceptible difference in a stress test?