Build Your Own Arcade Controls Forum
Software Support => GroovyMAME => Topic started by: formula409 on September 02, 2020, 04:36:19 am
-
Is it just me, or does it sound like someone at Nvidia saw GroovyMAME and decided to rip off frame delay?
When developers integrate the Reflex SDK, they are able to effectively delay the sampling of input and game simulation by dynamically adjusting the submission timing of rendering work to the GPU so that they are processed just-in-time.
:laugh:
https://www.nvidia.com/en-us/geforce/news/reflex-low-latency-platform/
-
Actually, this is not a new feature. They seem to have just given it a new name.
-
Actually, this is not a new feature. They seem to have just given it a new name.
Can you elaborate? What was the old one?
-
I thought it was the (ultra) low latency mode, which they previously called "pre-rendered frames" or something (when set to 0/1). Are they really doing something else with this upcoming Reflex thing? Doesn't sound like it.
-
I thought it was the (ultra) low latency mode, which they previously called "pre-rendered frames" or something (when set to 0/1). Are they really doing something else with this upcoming Reflex thing? Doesn't sound like it.
Did you even read the article? It has nothing to do with the pre-rendered frames. Read it. It basically sounds like frame delay.
-
Just did, thanks, you're right. Pasting the relevant lines here so that nobody else has to:
In the above image, we can see that the queue is filled with frames. The CPU is processing frames faster than the GPU can render them causing this backup, resulting in an increase of render latency. The Reflex SDK shares some similarities with the Ultra Low Latency Mode in the driver; however, by integrating directly into the game, we are able to control the amount of back-pressure the CPU receives from the render queue and other later stages of the pipeline. While the Ultra Low Latency mode can often reduce the render queue, it can not remove the increased back-pressure on the game and CPU side. Thus, the latency benefits from the Reflex SDK are generally much better than the Ultra Low Latency mode in the driver.
When developers integrate the Reflex SDK, they are able to effectively delay the sampling of input and game simulation by dynamically adjusting the submission timing of rendering work to the GPU so that they are processed just-in-time.
Additionally, the SDK also offers a feature called Low Latency Boost. This feature overrides the power saving features in the GPU to allow the GPU clocks to stay high when heavily CPU-bound. Even when the game is CPU-bound, longer rendering times add latency. Keeping the clocks higher can consume significantly more power, but can reduce latency slightly when the GPU is significantly underutilized and the CPU submits the final rendering work in a large batch. Note that if you do not want the power tradeoff, you can use Reflex Low Latency mode without the Boost enabled.
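For anyone who hasn't dug into how GroovyMAME's frame delay does the same thing, here's a minimal sketch of the idea (the function names, the 0-9 level range, and the 1/10-of-a-frame step are illustrative, not actual MAME source): after vsync, the loop deliberately sleeps for a fraction of the frame period before sampling input and emulating, so input is read as late as possible while emulation and rendering still finish before the next vsync.

```python
import time

def frame_delay_offset(frame_period_s, delay_level, max_level=9):
    """Seconds to wait after vsync before sampling input and emulating.
    delay_level 0 disables the wait; each higher level pushes input
    sampling one tenth of a frame later (hypothetical helper)."""
    if not 0 <= delay_level <= max_level:
        raise ValueError("delay_level out of range")
    return frame_period_s * delay_level / 10.0

def run_frame(frame_period_s, delay_level,
              wait_for_vsync, sample_input, emulate_and_render):
    wait_for_vsync()                       # frame boundary
    time.sleep(frame_delay_offset(frame_period_s, delay_level))
    inputs = sample_input()                # sampled just-in-time
    emulate_and_render(inputs)             # must finish before next vsync
```

The tradeoff is the same one Reflex has to manage: the higher the delay level, the less time is left to emulate and render, so too aggressive a setting causes missed frames instead of lower latency.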
-
Send 'em the lawyers Calamity! :D
They stole your ---steaming pile of meadow muffin---!
-
I thought it was the (ultra) low latency mode, which they previously called "pre-rendered frames" or something (when set to 0/1).
And that sounded like a hard_gpu_sync ripoff (the option in retroarch).
Guess nvidia and amd aren't completely blind and deaf to the 'scene' if they can 'borrow' ideas from emulator and frontend devs.
And if there are no patents, well...
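The common thread in all of these (frame delay, Ultra Low Latency, hard_gpu_sync, Reflex) is limiting how many frames sit in flight between input sampling and scanout. A toy model of that relationship, with made-up numbers rather than real measurements:

```python
def input_to_display_latency(frames_queued, frame_period_s, render_time_s):
    """Rough input-to-display latency for a pipelined renderer:
    each frame sitting in the render queue adds one frame period
    on top of the time spent rendering the frame itself (toy model)."""
    return frames_queued * frame_period_s + render_time_s

# 3-deep render queue at 60 Hz vs. hard sync (queue depth 0),
# assuming a hypothetical 4 ms render time:
queued = input_to_display_latency(3, 1 / 60, 0.004)
synced = input_to_display_latency(0, 1 / 60, 0.004)
```

In this model draining the queue saves three frame periods (50 ms at 60 Hz), which is why every one of these techniques, whatever the marketing name, attacks the queue first.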