Author Topic: Most Powerful GPU for Groovymame  (Read 3323 times)


schmerzkaufen

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline
  • Posts: 580
  • Last login:Today at 09:10:20 am
  • I want a large cream coffee
Re: Most Powerful GPU for Groovymame
« Reply #40 on: January 24, 2019, 07:37:08 am »
What you say implies some misunderstandings and a lack of some relevant info, but if you've read me in other threads where this has derailed already, please really, really don't get me started on the mamedev topic.

****

Anyway, let's continue this thread: has Philexile with his i9-9900k + R9 380 tried some challenging games like the STV titles since then?  :P
« Last Edit: January 24, 2019, 07:43:20 am by schmerzkaufen »
GroovyMAME oddball LCD user: W7 64, viewsonic vx3211-mh, i5-4690k @4.1GHz, Rx 570, crt_emudriver 2.0b15

schmerzkaufen

Re: Most Powerful GPU for Groovymame
« Reply #41 on: January 27, 2019, 04:38:33 am »
So since there isn't much enthusiasm :P for investigating GPU performance vs. frame_delay, especially with regard to LCD use in this case, I've ordered a couple more used AMD cards and will attempt a comparison myself, provided the purchases went well (bad used cards aren't too rare, unfortunately, and eBay has become increasingly shady no matter how good the seller ratings look).

Although not massively far apart: an R7 370 and an R9 380X, which I will put up against the R7 260X and GTX 750 Ti I already own, on 1920x1080 and 1600x1200 displays.
Though I don't know if there's much difference in bandwidth requirements between these two resolutions. Too bad I don't own a WQHD display yet, but I will probably get one in the future after reselling at least two of these cards.
Don't count on me for 4K and more powerful cards though; if you want to know those thresholds you'll have to ask owners of 4K displays and high-end cards.

I'm waiting for these cards and GM 0.206 before starting, because I want to test with HLSL on too.
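For anyone who wants to repeat this kind of sweep at home, here's a rough dry-run sketch. It only builds and prints the command lines rather than executing them; -frame_delay and -bench are actual GroovyMAME/MAME switches, while the executable name and ROM ("sfiii3") are placeholders for whatever demanding title you pick:

```python
# Build one benchmark command per frame_delay level (0-9).
# -bench 30 runs the game unthrottled/headless-ish for 30 seconds and
# prints an "Average speed" line at exit, which is what you'd record.
ROM = "sfiii3"  # placeholder: any stressful title works

commands = [
    ["groovymame", ROM, "-frame_delay", str(fd), "-bench", "30"]
    for fd in range(10)
]

for cmd in commands:
    print(" ".join(cmd))
# Once satisfied, execute each with subprocess.run(cmd) and note the
# reported average speed for your comparison chart.
```

Sorting the results per card and per frame_delay level is then just a matter of bookkeeping.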

Torkyo

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline
  • Posts: 19
  • Last login:November 10, 2019, 04:22:50 pm
  • I want to build my own arcade controls!
Re: Most Powerful GPU for Groovymame
« Reply #42 on: January 28, 2019, 10:10:47 am »
Lol, I doubt I know much more than you do, I've just been around a bit longer, and unlike a good portion of this community's members I don't possess even a tiny fraction of the knowledge that would give me hope of one day actually understanding how all this stuff works; I will only ever see the tip of the iceberg.  ;D

Still I persist, for two reasons:
- GroovyMAME, compared to official MAME and other alt builds, can indeed provide a more accurate experience of the games; it's not pretend, and I've witnessed here that its advantages are not limited to CRT setups.
- Arcade PCBs, like most retro games, have become collector's items worth hundreds or thousands, even tens of thousands, so if there's a way to play through emulation that's realistically closer to the real thing, it is worth using and supporting, because IMO it completes, and gives more meaning to, the flawed/incomplete definition of preservation that official MAME stands by.

So what are a few hundred bucks, or even a thousand and a half, if we can have this now? I don't understand the guys with a Pi, or those who buy the cheapest, weakest old hardware they can find, yet later complain about the limitations. As with PC gaming, better performance in emulation costs money; I accepted that a long time ago.
Also, I'm not going to wait 10 or 20 years until computer hardware that can do the job costs only a fraction of today's price.
Nor am I waiting another 10 or 20 years for MAME to fix already decades-old issues when acceptable workarounds exist.
(plus honestly I'm getting too old for this ---steaming pile of meadow muffin---, I give myself another decade of caring about it, at best)

Sure, it would be nice if we knew more, so we could build rigs with more precision and weigh the value for money of each part, but you can't go wrong with more power anyway, and it's still considerably cheaper than buying arcade boards; but I'm being redundant.

The i9-9900k might be too much for a lot of users, but as I said to Philexile, it's also the kind of purchase you make if you intend to keep a rig relevant for many, many years, and this CPU will likely still be considered strong even in 10 years.
By comparison, I'd say my i5-4690k, which is 5 years old, is still technically relevant, but despite its overclocking headroom it has entered the obsolescence portion of the curve (the same price I paid 5 years ago today buys more performance/$; the i3-8350k beats it).

But the topic here is GPUs, and frame_delay, which we're having a harder time rationalizing...

Sorry schmerzkaufen! Been bloody busy and I couldn't reply as fast as I wanted! In the meanwhile I've seen the thread has moved on (very good reading btw), and maybe, as you rightly said, we were going a bit off topic here: I'm waiting to hear something from Philexile about the STV titles!
Anyway thanks, I got your points.

schmerzkaufen

Re: Most Powerful GPU for Groovymame
« Reply #43 on: January 28, 2019, 02:14:06 pm »
Quote from: Torkyo on January 28, 2019, 10:10:47 am
Sorry schmerzkaufen! Been bloody busy and I couldn't reply as fast as I wanted!
Don't be sorry, it's ok, these are slow niche forums; we can leave a week or two between replies and no one will complain (afaik no one in the community is showing signs of early Alzheimer's yet, so that pace is fine)  ;D
You know, I had planned to test a pile of things with GM and various hardware, I have a todo list (yes) that I wanted to attack by the end of last November; well, it's almost February. :p
The other factor is that Calamity makes Groovy progress quite fast, and when I think ok, let's do this, he's already a step ahead with more improvements and fixes.

Zebra

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline
  • Posts: 372
  • Last login:Today at 12:02:08 pm
  • I want to build my own arcade controls!
Re: Most Powerful GPU for Groovymame
« Reply #44 on: January 30, 2019, 06:08:50 pm »
I think that virtual machines are the way to go if you want to run multiple emulators on one PC.

My experience has been that MAME does not make proper use of the GPU yet (although I read they are working on it), but plenty of other emulators do.

GroovyMAME works great with an old $10 eBay card like a 5450 if you have a decent CPU, but Naomi emulators barely run without something better. Even Model 2 and Model 3 stutter on my machine with the 5450.

Then there are the slightly newer PC-based arcade games. Some of these can now be played with VirtualBox images but still need a decent GPU.

So... investing in one decent desktop and being able to use one image to run GM on a CRT with a low-spec card, and a second image to run everything else with a newer high-spec card, seems optimal. Or... run two machines if space and money are no object.

schmerzkaufen

Re: Most Powerful GPU for Groovymame
« Reply #45 on: January 30, 2019, 06:59:38 pm »
Well (for the nth time! ;D) the topic is not MAME's performance with GPUs, but GroovyMAME's frame_delay performance with GPUs, under various configurations.

Unlike baseline MAME, which uses the GPU only for video output and shaders, Groovy needs some of its actual processing power for lag reduction. And with higher-resolution displays the requirements are apparently many steps above those of low-res CRTs; I will focus on the former.

Anyway, tomorrow I'll go fetch that R9 380X; fingers crossed it works fine (I didn't pay much for it, and I have trust issues when it comes to used PC hardware, in particular from eBay). Then I can begin my little benchmarking adventure; maybe with a chart it'll become clearer to readers what this is about.

It'll take some time tho...

EDIT: damn, Murphy's Law really is powerful these days, especially when buying off eBay
- received both cards: a 380X and... a 380 instead of the 370 I ordered. That 380 is a 'small' one, comparable to a boosted 370, but the power requirements are still those of a 380, too close to the 380X naturally, so rather redundant
- both require 2x 6-pin PCIe, and I didn't realize I'm short one cable (of course none were included in the boxes)
:/
I'm gonna pick the one that works best and resell the other quickly. I hate eBay, but here it's really the last remaining place to find a broad variety of old hardware; our local Amazon isn't stocked with enough used and refurbished goods anymore (or not that old).
I'll limit my benchmarking to three cards then: 750 Ti, 260X and one of the 380s. Should be enough anyway.
« Last Edit: January 31, 2019, 11:59:48 am by schmerzkaufen »

schmerzkaufen

Re: Most Powerful GPU for Groovymame
« Reply #46 on: February 08, 2019, 08:51:28 am »
First experience today with a RX 570 (CRT_Emudriver 2.0 beta 15 Win 7 64) on a 1080p LCD.

Coming from an R7 260X, at first sight (I only quickly tried a couple of games) it doesn't seem the 570 makes a significant difference with regard to how high I can set frame_delay; games seem to have a frame_delay max threshold, period.

Rather, it's when I use vsync_offset along with HLSL that something's changed a bit: I either don't see the tearing line, or it's less pronounced.
In practice it seems that in some cases this will allow me to maintain the same frame_delay level whether HLSL is on or off (while with the 260X I always have to drop frame_delay one step in order to maintain a stable HLSL'd picture).

These are only preliminary observations though, no rushed conclusions (to be continued...)
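For readers following along, the knobs being compared here live in mame.ini. The fragment below is only an illustration of the combination described above, not a recommendation: frame_delay, vsync_offset and hlsl_enable are real options, but the values are made up for the example.

```ini
# lag-related GroovyMAME options
frame_delay              7    # 0-9; higher = less lag, needs faster hardware
vsync_offset             0    # raise only if a static tearing line appears

# stock MAME Direct3D post-processing
video                    d3d
hlsl_enable              1
```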

Calamity

  • Moderator
  • Trade Count: (0)
  • Full Member
  • *****
  • Offline
  • Posts: 6772
  • Last login:Today at 10:36:41 am
  • Quote me with care
Re: Most Powerful GPU for Groovymame
« Reply #47 on: February 08, 2019, 10:07:16 am »
Quote from: schmerzkaufen on February 08, 2019, 08:51:28 am
First experience today with a RX 570 (CRT_Emudriver 2.0 beta 15, Win 7 64) on a 1080p LCD.

Coming from an R7 260X, at first sight (I only quickly tried a couple of games) it doesn't seem the 570 makes a significant difference with regard to how high I can set frame_delay; games seem to have a frame_delay max threshold, period.

Rather, it's when I use vsync_offset along with HLSL that something's changed a bit: I either don't see the tearing line, or it's less pronounced.
In practice it seems that in some cases this will allow me to maintain the same frame_delay level whether HLSL is on or off (while with the 260X I always have to drop frame_delay one step in order to maintain a stable HLSL'd picture).

These are only preliminary observations though, no rushed conclusions (to be continued...)

Yes, that makes total sense.

Frame delay is a CPU hog for the most part (it's not that it uses all your CPU; it's that it requires a very fast CPU in order to keep frame emulation time as short as possible).

However, once you start using higher resolutions, the rendering process itself starts having more impact. And once the rendering time becomes longer than the vertical retrace time, you start seeing static tearing. You compensate for that with vsync offset, but vsync offset can be seen as the opposite of frame delay: raising vsync offset is like lowering frame delay (it's a bit more complicated, but you get the idea). So the faster your video card is, the less need for vsync offset and the more effective frame delay will be.
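Put into numbers, a back-of-the-envelope sketch of why frame delay squeezes the time budget, assuming a 60 Hz display and the common description of the mechanism (frame_delay N postpones the start of emulation to N/10 of the frame period):

```python
# Time left for emulating + rendering one frame at 60 Hz for a few
# frame_delay levels, under the N/10-of-the-frame-period assumption.
frame_ms = 1000.0 / 60.0  # one frame period, ~16.67 ms

for fd in (0, 5, 7, 9):
    budget = frame_ms * (10 - fd) / 10
    print(f"frame_delay {fd}: {budget:.2f} ms to emulate and render")
```

At frame_delay 9 that leaves under 2 ms, which is why heavy post-processing or a high output resolution can force you a step down, and why a faster GPU buys back margin.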
Important note: posts reporting GM issues without a log will be IGNORED.
Steps to create a log:
 - From the command line, run: groovymame.exe -v romname >romname.txt
 - Attach the resulting romname.txt file to your post, instead of pasting it.

CRT Emudriver, VMMaker & Arcade OSD downloads, documentation and discussion:  Eiusdemmodi

schmerzkaufen

Re: Most Powerful GPU for Groovymame
« Reply #48 on: February 08, 2019, 10:33:08 am »
I think I understand most of what you said. So the benefit of a more powerful card here is not directly in the highest frame delay level you can achieve with a game; it is rather in how much more display resolution you can achieve it at, also with post-processing adding its weight on top, without the drawbacks (tearing and the need for vsync_offset).

Too bad I don't have a higher-resolution display right now to make my testing more meaningful.
Well... I have a 1600x1200 in storage; would that make much of a perceptible difference in a stress test?
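For what it's worth, in raw pixel count the two panels are nearly identical, so the extra stress would likely be small. Quick arithmetic, nothing GroovyMAME-specific:

```python
# Pixels pushed per frame for the two candidate test displays.
modes = {"1920x1080": 1920 * 1080, "1600x1200": 1600 * 1200}
for name, px in modes.items():
    print(f"{name}: {px:,} pixels")
# 1600x1200 is actually about 7% FEWER pixels than 1920x1080,
# so as a stress test it would be a slight step down, not up.
```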
« Last Edit: February 08, 2019, 10:44:48 am by schmerzkaufen »