
Author Topic: Vertical Refresh Rate: set by game software or monitor hardware?  (Read 5497 times)


torino

  • -Banned-
  • Trade Count: (0)
  • Full Member
  • ***
  • Offline
  • Posts: 201
  • Last login:July 24, 2011, 05:18:12 pm
  • -Banned-
Game: Galaga
Resolution: 224x288
Frequency: 60.606061Hz

VSync, the thing we need for smooth scrolling and steady animation. As far as I know it is the game software that waits for the monitor hardware to say when it is time to update the video buffer, not the other way around. So how did they get those numbers? From some manual, maybe a specification sheet, or did they actually measure it with some instrument? And what would they measure, monitor properties or some PCB output signal? Is the vertical refresh rate then a property of a given monitor, or of the game?

What happens when we plug a Galaga PCB into an arcade monitor fixed at 60Hz?


Thank you.

ahofle

  • Trade Count: (+1)
  • Full Member
  • ***
  • Offline
  • Posts: 4544
  • Last login:August 30, 2023, 05:10:22 pm
    • Arcade Ambience Project
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #1 on: March 27, 2011, 04:53:07 pm »
The PCB will drive the appropriate resolution/refresh of the arcade monitor -- it doesn't wait for any feedback from the monitor.  Also, there is no 'fixed' 60Hz on a true CRT monitor, but rather a range of values that defines the supported frequencies of the monitor.

torino

  • -Banned-
  • Trade Count: (0)
  • Full Member
  • ***
  • Offline
  • Posts: 201
  • Last login:July 24, 2011, 05:18:12 pm
  • -Banned-
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #2 on: March 27, 2011, 05:52:16 pm »
Quote
The PCB will drive the appropriate resolution/refresh of the arcade monitor -- it doesn't wait for any feedback from the monitor.  Also, there is no 'fixed' 60hz on a true CRT monitor, but rather a range of values that define the supported frequency/range of the monitor.

How exactly do you propose the PCB can "drive" the monitor refresh rate? Do you think the vertical refresh can vary from frame to frame, so that when the game software slows down it would slow the monitor refresh rate accordingly? Of course not, and of course the software (good software, anyway) waits for the VSync signal, otherwise we get choppiness and tearing.

Yes, CRTs seem to be flexible; even TVs can take quite a range. But what about old CGA PC/Amiga monitors? They had 15kHz horizontal and a fixed 60Hz vertical refresh, I think. Or how about LCD arcade monitors: what's their refresh rate, and what would happen if we plugged a Galaga PCB into such a monitor? In any case the question remains - how did they obtain that number?

« Last Edit: March 27, 2011, 06:08:57 pm by torino »

ahofle

  • Trade Count: (+1)
  • Full Member
  • ***
  • Offline
  • Posts: 4544
  • Last login:August 30, 2023, 05:10:22 pm
    • Arcade Ambience Project
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #3 on: March 28, 2011, 12:32:36 am »
Quote
The PCB will drive the appropriate resolution/refresh of the arcade monitor -- it doesn't wait for any feedback from the monitor.  Also, there is no 'fixed' 60hz on a true CRT monitor, but rather a range of values that define the supported frequency/range of the monitor.

Quote
How exactly do you propose PCB can "drive" monitor refresh rate? Do you think vertical refresh can vary from frame to frame, so when the game software slows down it would accordingly slow down monitor refresh rate? Of course not, and of course the software (good one) waits for VSync signal, otherwise we get choppiness and tearing.

It "drives" it the same way Windows "drives" it when you select a particular resolution.  That's what the sync wire(s) in a VGA cable are for.  I didn't say anything about varying the sync from frame to frame, not sure where you pulled that from.  Wait for vsync just means that the application will match its internal frame rate to the exact sync of the monitor, which is a known value by the application or PCB.  The game is not "listening" for anything from the monitor.  You are suggesting that the application or game will stop running if you disconnect the monitor because it then can't get the vsync "signal".   ???

I don't know how the vertical refresh rate for Galaga was determined, but it is just one variable in a fairly complex function.  Read up on modelines for more information. 
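For the curious, a modeline bundles exactly those variables: dotclock, visible size, sync-pulse start/end, and total size, first for horizontal and then for vertical. A hypothetical modeline with Galaga-like timings might look as follows (the 6.144MHz dotclock and the 384x264 totals are the values MAME's data uses for this hardware class; the sync-pulse positions here are made up purely for illustration):

```
#                         dotclock(MHz) hdisp hss hse htot  vdisp vss vse vtot
Modeline "288x224@60.61"  6.144         288 304 336 384     224 234 242 264  -hsync -vsync
```

The refresh rates fall out of the totals: 6144000 / 384 = 16000 lines per second, and 16000 / 264 ≈ 60.606 frames per second.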

SavannahLion

  • Wiki Contributor
  • Trade Count: (+1)
  • Full Member
  • ***
  • Offline
  • Posts: 5986
  • Last login:December 19, 2015, 02:28:15 am
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #4 on: March 28, 2011, 01:41:35 am »
ahofle, I may be wrong here, but it sounds like the OP is attempting to apply modern architecture concepts to the old hardware. I mean... feedback from the monitor?

To the OP.

Arcade monitors, heck, most any monitor from that era, don't have any "feedback" on the state of the monitor. Either the signal gets there or it doesn't. If it doesn't, it's called playing blind. The frequency rate is something that is/was established by the industry as a whole. There is a ton of history behind the design of the CRT and how and why countries like the U.S. chose 60Hz whereas other countries use 50Hz. Why computers up to a specific speed never used a video card (hint: check the history on the South Bridge). Why buffers aren't used on some boards. So on and so forth.

The short answer is the developers already know the frequency rate of the monitor going into the design of any game. Remember, CRT (and later LED/LCD/Plasma/whatever) manufacturers don't cater to just arcade cabinets. Those tubes and panels are manufactured for a wide range of applications.

torino

  • -Banned-
  • Trade Count: (0)
  • Full Member
  • ***
  • Offline
  • Posts: 201
  • Last login:July 24, 2011, 05:18:12 pm
  • -Banned-
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #5 on: March 28, 2011, 01:58:10 am »
Quote
The PCB will drive the appropriate resolution/refresh of the arcade monitor -- it doesn't wait for any feedback from the monitor.  Also, there is no 'fixed' 60hz on a true CRT monitor, but rather a range of values that define the supported frequency/range of the monitor.

Quote
How exactly do you propose PCB can "drive" monitor refresh rate? Do you think vertical refresh can vary from frame to frame, so when the game software slows down it would accordingly slow down monitor refresh rate? Of course not, and of course the software (good one) waits for VSync signal, otherwise we get choppiness and tearing.

Quote
It "drives" it the same way Windows "drives" it when you select a particular resolution.  That's what the sync wire(s) in a VGA cable are for.  I didn't say anything about varying the sync from frame to frame, not sure where you pulled that from.  Wait for vsync just means that the application will match its internal frame rate to the exact sync of the monitor, which is a known value by the application or PCB.  The game is not "listening" for anything from the monitor.  You are suggesting that the application or game will stop running if you disconnect the monitor because it then can't get the vsync "signal".   ???

I don't know how the vertical refresh rate for Galaga was determined, but it is just one variable in a fairly complex function.  Read up on modelines for more information.  

I realize my mistake was to say the "monitor hardware" could influence the workings of the PCB, while the RGB+VSync lines are output only. The signal goes in just one direction, from PCB to monitor, and so indeed you are correct to say there is no feedback coming from the monitor itself. However, there is a third thing standing in between - the video adapter.

So, the game software boots up and initialises the video adapter, and by setting the resolution (width and height in pixels) it effectively programs the adapter's dotclock. From this point on it's the video adapter's dotclock timing that will drive the monitor refresh rate, and in return the adapter will ultimately also make the VBlank information available for the software to sync itself with.


Still, how did they obtain that number, and what is the error of that measurement?

What does "224x288" refer to, visible pixels or total pixels? And what if Galaga is meant to be running at a round 60Hz, and the digits behind the decimal point are actually just the result of imperfections in that particular measurement and that specific board, maybe due to age or heat? It would be really silly if we now couldn't properly sync this game with our fixed 60Hz LCDs just because of some measurement error, wouldn't it?
« Last Edit: March 28, 2011, 02:14:34 am by torino »

torino

  • -Banned-
  • Trade Count: (0)
  • Full Member
  • ***
  • Offline
  • Posts: 201
  • Last login:July 24, 2011, 05:18:12 pm
  • -Banned-
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #6 on: March 28, 2011, 02:12:11 am »
Quote
ahofle, I may be wrong here, but it sounds like the OP is attempting to apply modern architecture concepts to the old hardware. I mean... feedback from the monitor?

To the OP.

Arcade monitors, heck, most any monitor from that era don't have any "feedback" on the state of the monitor. Either the signal gets there or it doesn't. If it doesn't, it's called playing blind. The frequency rate is something that is/was established by the industry as a whole. There is a ton of history behind the design of the CRT and the how and why countries like the U.S. chose 60Hz whereas other countries use 50Hz. Why computers up to a specific speed never used a video card (hint: check the history on the South Bridge). Why buffers aren't used on some boards. So on and so forth.

The short answer is the developers already know the frequency rate of the monitor going into the design of any game. Remember, CRT (and later LED/LCD/Plasma/whatever) manufacturers don't cater to just arcade cabinets. Those tubes and panels are manufactured for a wide range of applications.

What do you mean, "developers already know"? How? And why are you talking about the "frequency rate of the monitor" when that's irrelevant, given that a CRT's range could cover all sorts of resolutions and a range of refresh rates? I'm a developer too and I have several game PCBs, but I don't "already know", so please tell me: do I read it somewhere, or do I measure something, and what exactly?

MonMotha

  • Trade Count: (+2)
  • Full Member
  • ***
  • Offline
  • Posts: 2378
  • Last login:February 19, 2018, 05:45:54 pm
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #7 on: March 28, 2011, 02:30:55 am »
The monitor is a pure slave device.  It will accept frames at whatever rate the video source sends them as long as it is within its capabilities.  Depending on the monitor design, it may attempt to convert the input frame rate (by duplicating/dropping frames or some more complicated method) to something else.  Fully analog video paths are not capable of this, and this describes the vast majority of multisync and standard definition CRT monitors and TVs (including those with digital OSD and such - the video input path is still analog in most designs).  High-end LCDs and plasmas are also capable of driving the panel at somewhat arbitrary refresh rates, though their range is smaller than high-end multisync CRTs.  Low-end LCDs and plasmas usually have a very limited range floating right around 60Hz and will have to convert everything to something in that range.

CRT arcade monitors are usually capable of syncing between 47-63Hz.  Adjustment to a "VHOLD" or "VFREQ" or "50/60Hz" control may be required as this rate is varied on some monitors.  Some, especially newer multisyncs, can go quite a bit higher.  Some arcade games ran at oddball frequencies.  Mortal Kombat II, for example, is about 53Hz.


Now, that handles the monitor.  Further complicating the situation is the PC hardware and software.

On Windows, it's damn near impossible to actually ask for and get an exact set of video timings, even if you know exactly what you want.  The interface to ask for anything more specific than "visible resolution + refresh rate" is driver specific and often requires a reboot!  Even then, most drivers will give you something close and call it good enough.  This makes attempting to time things based on the vertical refresh rate darn near impossible.  Some arcade games have made this mistake when moving from dedicated or console-based platforms, where such timing behavior is common, to PCs.

On Linux, you can certainly ask for an exact set of video timings using XRANDR or as a mode in the X configuration file, but what you'll get again depends on the driver.  Usually it's close, but it won't be exact.  It'll be whatever is closest that it's possible to program the PLL on the video card to.


The software also has to know how to ask.  MAME does not ask for exact video timings in any situation, at least not that I'm aware of.  MAME asks for a visible resolution and approximate framerate (IIRC, Windows is limited to specifying integer framerates, but I could be wrong on this).  Sometimes, it would be possible to use this information to work out exactly what game MAME is playing and set the video hardware to the closest possible setting, but sometimes this information is ambiguous, and again, the video hardware may not be able to hit EXACTLY what the original game expects.  The ArcadeVGA, for example, comes pre-programmed with a list of such pairs ("what MAME asks for" vs. "what you get in an attempt to match what MAME really wants") for common games.

I don't know enough about the internals of MAME to know how it handles any remaining discrepancy when attempting to output "native" video.  In theory, it would be possible to back-time the entire emulator based on the actually-obtained vertical refresh rate (the remaining mismatch likely being imperceptible), but I don't know if it does this, and it would slightly compromise the accuracy of the game timing, though probably not any worse than other issues surrounding timers on a preemptive, multitasking OS running on a general-purpose PC.  If it does not do this, which I don't think it does, the visual behavior will depend on your output settings (wait for vsync, triple buffer, etc.).  If wait for vsync is enabled, a frame will (very) occasionally be dropped to compensate for the slight difference in refresh rates between the emulated machine and the actual video output.  If triple buffering is also enabled, then a frame might occasionally be duplicated, instead, if the output runs faster than the emulation. If wait for vsync is disabled and triple buffering is disabled, then a tear line will (very) slowly scroll up or down the screen.  This might happen so slowly that it never makes it out of the inactive video region while you're playing, if the two rates are very close.
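To get a feel for how rare those drops are when the two rates are close, here is a quick back-of-the-envelope calculation, using Galaga's rate from this thread against a plain 60Hz output:

```python
# Estimate how often a frame gets dropped when a 60.606061 fps emulation
# is displayed, vsync-locked, on a 60.000 fps output.
game_fps = 60.606061
output_fps = 60.0

surplus = game_fps - output_fps     # extra emulated frames per second
seconds_per_drop = 1.0 / surplus    # one frame dropped about this often

print(round(seconds_per_drop, 2))   # 1.65 (seconds between dropped frames)
```

So roughly one frame in a hundred is dropped, every second and a half or so, which is why the artifact is subtle when the rates are this close.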


In general, it's considered poor design for a PC application to require a specific set of video timings for proper operation.  The preferred method is to draw frames as fast as possible, or as fast as the output is going, waiting for vsync, and to time the application itself off a more "consistent" timer facility.  Essentially, one should decouple the video timings from the timings of the application as a whole.  This actually goes for other hardware-generated timings, too, such as audio sample clocks.  Unfortunately, emulation of arcade games results in video (and audio) running at exactly a specific set of timings, so this is unavoidable.


If you want to measure the exact timings used by a game PCB, there are a few ways to do it.  The easiest method is to use an instrument commonly known as a "frequency counter" on the hsync and vsync lines.  This will give you the exact frequency of that signal accurate to whatever the instrument is capable of (generally 0.1Hz or better).  Measuring hsync+vsync rates and knowing the active video resolution, which you can usually infer from a little reverse engineering of the software, gives you enough information to fully emulate the game (the only missing information is the distribution of the time between the front and back porches, which just determines the centering of the video on the monitor).  Of course, sometimes you can just ask the former game developers for this information :)
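As a sketch of that procedure, here is the arithmetic for working back from two frequency-counter readings; the measured values below are Galaga's nominal ones, and htotal is the kind of figure the reverse-engineering step would supply:

```python
# Derive video timings from frequency-counter measurements on the sync lines.
hsync_hz = 16000.0     # measured on the hsync line
vsync_hz = 60.606061   # measured on the vsync line

# Total lines per frame, including blanking: lines per second / frames per second.
vtotal = round(hsync_hz / vsync_hz)

# Total pixels per line can't be measured off the sync lines; it comes from
# reverse engineering the board. 384 is the value used here for illustration.
htotal = 384
dotclock_hz = hsync_hz * htotal

print(vtotal)        # 264
print(dotclock_hz)   # 6144000.0
```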

torino

  • -Banned-
  • Trade Count: (0)
  • Full Member
  • ***
  • Offline
  • Posts: 201
  • Last login:July 24, 2011, 05:18:12 pm
  • -Banned-
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #8 on: March 28, 2011, 06:17:38 am »
Quote
The monitor is a pure slave device.  It will accept frames at whatever rate the video source sends them as long as it is within its capabilities.  Depending on the monitor design, it may attempt to convert the input frame rate (by duplicating/dropping frames or some more complicated method) to something else.  Fully analog video paths are not capable of this, and this describes the vast majority of multisync and standard definition CRT monitors and TVs (including those with digital OSD and such - the video input path is still analog in most designs).  High-end LCDs and plasmas are also capable of driving the panel at somewhat arbitrary refresh rates, though their range is smaller than high-end multisync CRTs.  Low-end LCD and plasmas usually have a very limited range floating right around 60Hz and will have to convert everything to something in that range.

Yes, but that's only true for LCD/plasma type monitors. They have truly fixed frequencies, so they have to do this 'digital processing', which is a very artificial and poor-looking way to convert refresh rates. It's really intended for PAL/NTSC conversion, where there are +/-10 frames each second to play with and interpolate/extrapolate digitally, but it is not designed to handle small offsets like Galaga's +0.6060, unless the panel can truly adjust its refresh rate as opposed to "converting" it.

On the other hand we have 'pure analog' CRTs. However, you are making a wrong turn by expecting them to do this same digital thing, when they are actually in a much better position to start with: they don't need to, and shouldn't ever, attempt it. By their design principle they are "naturally" either able to sync or not, and that's it.

My 50Hz SCART TV can sync to 60Hz, and anything in between and around; it's only that the picture shrinks as it gets close to 60Hz and some lines pop out from the top of the screen, but the picture is there along with its authentic vertical refresh rate. Yes, some CRTs don't have such flexibility, but that is moving us away from the point. And the point is that this problem of properly emulating arcade games' vertical refresh only came around with LCDs, as even PC CRTs allowed some flexibility in frequency, I think. So at least theoretically most of them could drive Galaga's 60.606061Hz without any additional converters, upscalers or whatever, in contrast to LCDs, which have a truly fixed rate and so can't.


Quote
CRT arcade monitors are usually capable of syncing between 47-63Hz.  Adjustment to a "VHOLD" or "VFREQ" or "50/60Hz" control may be required as this rate is varied on some monitors.  Some, especially newer multisyncs, can go quite a bit higher.  Some arcade games ran at oddball frequencies.  Mortal Kombat II, for example, is about 53Hz.

Yes, but if MAME is not able to set those 15kHz resolutions, or if you do not have such a monitor, that's of no use.

To be precise, I am talking only about one specific case, which is a Galaga PCB (or MAME) + a fixed 60Hz LCD monitor.


Quote
Now, that handles the monitor.  Further complicating the situation is the PC hardware and software.

On Windows, it's damn near impossible to actually ask for and get an exact set of video timings, even if you know exactly what you want.  The interface to ask for anything more specific than "visible resolution + refresh rate" is driver specific and often requires a reboot!  Even then, most drivers will give you something close and call it good enough.  This makes attempting to time things based on the vertical refresh rate darn near impossible.  Some arcade games have made this mistake when moving from dedicated or console-based platforms, where such timing behavior is common, to PCs.

On Linux, you can certainly ask for an exact set of video timings using XRANDR or as a mode in the X configuration file, but what you'll get again depends on the driver.  Usually it's close, but it won't be exact.  It'll be whatever is closest that it's possible to program the PLL on the video card to.


The software also has to know how to ask.  MAME does not ask for exact video timings in any situation, at least not that I'm aware of.  MAME asks for a visible resolution and approximate framerate (IIRC, Windows is limited to specifying integer framerates, but I could be wrong on this).  Sometimes, it would be possible to use this information to work out exactly what game MAME is playing and set the video hardware to the closest possible setting, but sometimes this information is ambiguous, and again, the video hardware may not be able to hit EXACTLY what the original game expects.  The ArcadeVGA, for example, comes pre-programmed with a list of such pairs ("what MAME asks for" vs. "what you get in an attempt to match what MAME really wants") for common games.

I don't know enough about the internals of MAME to know how it handles any remaining discrepancy when attempting to output "native" video.  In theory, it would be possible to back-time the entire emulator based on the actually-obtained vertical refresh rate (the remaining mismatch likely being imperceptible), but I don't know if it does this, and it would slightly compromise the accuracy of the game timing, though probably not any worse than other issues surrounding timers on a preemptive, multitasking OS running on a general-purpose PC.  If it does not do this, which I don't think it does, the visual behavior will depend on your output settings (wait for vsync, triple buffer, etc.).  If wait for vsync is enabled, a frame will (very) occasionally be dropped to compensate for the slight difference in refresh rates between the emulated machine and the actual video output.  If triple buffering is also enabled, then a frame might occasionally be duplicated, instead, if the output runs faster than the emulation. If wait for vsync is disabled and triple buffering is disabled, then a tear line will (very) slowly scroll up or down the screen.  This might happen so slowly that it never makes it out of the inactive video region while you're playing, if the two rates are very close.

The video card and software are irrelevant if the monitor can't do it.

All the software has to do is wait for the vertical refresh signal, which is an IRQ raised by the graphics adapter. However, that will not work with emulators if the game is supposed to produce more or fewer frames per second than the monitor can manage; even if the difference is just a small fraction, you still get the full ugliness of the effect. And so LCDs can only show an authentic vertical refresh if the game's refresh rate is also exactly 60Hz, or by forcing the game to a 60Hz update by slowing it down or speeding it up, which is the only good-looking and proper way to do it on LCDs. Any other attempt to "fix" this situation is just as artificial and ugly a hack as digitally interpolating/extrapolating frames, which makes non-existing frames out of thin air or deletes existing ones and tries to arrange what's left of the original frames evenly over time.


Quote
In general, it's considered poor design for a PC application to require a specific set of video timings for proper operation.  The preferred method is to draw frames as fast as possible, or as fast as the output is going, waiting for vsync, and to time the application itself off a more "consistent" timer facility.  Essentially, one should decouple the video timings from the timings of the application as a whole.  This actually goes for other hardware-generated timings, too, such as audio sample clocks.  Unfortunately, emulation of arcade games results in video (and audio) running at exactly a specific set of timings, so this is unavoidable.

That's so very wrong. Where did you get that?

What exactly do you think is "poor" about having fluid animation and smooth scrolling? It's called "perfect", not "poor". And what benefit do you imagine there is in processing frames as fast as possible if many will never even get displayed on the screen at all, while you still get tearing and choppiness? What could possibly be a more "consistent" timer facility in relation to computer animation, and how would that help you if you don't sync to the monitor refresh at the end?

It's the first lesson of computer animation to sync everything to the monitor refresh rate. That's the preferred method, of course, and there is absolutely no reason to do otherwise, especially considering all the ugly effects you are bound to get. So please, where did you hear anything like that? How did you come to state what you just said?


Quote
If you want to measure the exact timings used by a game PCB, there are a few ways to do it.  The easiest method is to use an instrument commonly known as a "frequency counter" on the hsync and vsync lines.  This will give you the exact frequency of that signal accurate to whatever the instrument is capable of (generally 0.1Hz or better).  Measuring hsync+vsync rates and knowing the active video resolution, which you can usually infer from a little reverse engineering of the software, gives you enough information to fully emulate the game (the only missing information is the distribution of the time between the front and back porches, which just determines the centering of the video on the monitor).  Of course, sometimes you can just ask the former game developers for this information :)

Ok, thank you. So now we only need to establish what "224x288" refers to, visible pixels or total pixels?

Calamity

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline
  • Posts: 7463
  • Last login:July 19, 2025, 04:03:33 am
  • Quote me with care
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #9 on: March 28, 2011, 07:47:44 am »
Consider this random game, "motos", which has its xml data complete; let's calculate its vfreq:

      <chip type="cpu" tag="maincpu" name="M6809" clock="1536000"/>
      <chip type="cpu" tag="sub" name="M6809" clock="1536000"/>
      <chip type="audio" tag="namco" name="Namco 15XX" clock="24000"/>
      <display type="raster" rotate="90" width="288" height="224" refresh="60.606061" pixclock="6144000" htotal="384" hbend="0" hbstart="288" vtotal="264" vbend="0" vbstart="224" />

So you have a visible resolution of 288 x 224, but the total resolution is what matters:

htotal x vtotal = 384 x 264 = 101376 total pixels per frame

Now we need the dotclock. This one is probably an integer multiple of the cpu master clock, here:

dotclock = 4 x master_clock = 4 x 1536000 = 6144000 Hz (this was just to explain where the pixclock value in xml data comes from)

Now we calculate vfreq:

vfreq = 6144000 / 101376 = 60.60606061 Hz

- htotal and vtotal can be obtained by reverse engineering the pcb hardware.
- master clock frequency too.

That's the theoretical vfreq. Then of course you can plug an oscilloscope to the vsync output and check for the actual value obtained.
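The same calculation as a quick sketch, with the values taken straight from the xml data above:

```python
# Recompute motos' vertical refresh from its total resolution and pixel clock.
pixclock = 6_144_000        # Hz: 4 x the 1,536,000 Hz M6809 master clock
htotal, vtotal = 384, 264   # total pixels per line, total lines per frame

total_pixels = htotal * vtotal      # 101376 pixels per frame
vfreq = pixclock / total_pixels     # frames per second
hfreq = pixclock / htotal           # lines per second (16 kHz here)

print(f"{vfreq:.6f} Hz")            # 60.606061 Hz
```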
Important note: posts reporting GM issues without a log will be IGNORED.
Steps to create a log:
 - From command line, run: groovymame.exe -v romname >romname.txt
 - Attach resulting romname.txt file to your post, instead of pasting it.

CRT Emudriver, VMMaker & Arcade OSD downloads, documentation and discussion:  Eiusdemmodi

MonMotha

  • Trade Count: (+2)
  • Full Member
  • ***
  • Offline
  • Posts: 2378
  • Last login:February 19, 2018, 05:45:54 pm
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #10 on: March 28, 2011, 08:06:30 am »
Gah, I had a nice, long reply to this typed out, but smurfing SMF ate it when I accidentally hit the back button.

Essentially what you need to know is this: your impressions of LCD monitors being "purely fixed frame rate" are largely wrong.  Just like "fixed rate" CRTs, LCDs will accept a certain range of input refresh rates and WILL ACTUALLY REFRESH THE DISPLAY AT THAT RATE.  The exact range depends on the design of the monitor and is usually further limited by the panel and its accompanying driving electronics.  Common ranges are 47-63, 57-75, and 47-75Hz, but others of course exist.  What your monitor will do when presented with something outside this range depends on what the designer told it to do: some will attempt to convert while others will give an "out of range" error.  Since all LCD monitors have scalers anyway (since they ARE fixed pixel resolution devices), it's usually not terribly hard to include frame rate conversion, but it simplifies the scaler chain substantially to avoid it.

Of course, there may exist some brain-dead monitor designs out there that attempt to convert all inputs to the monitor's idea of its "frame rate".  I don't know WHY you'd do it that way, but it's of course possible.  I've never seen one.  If you have one, I'd suggest defenestrating it.


As to proper PC programming practice, I think you misunderstood me.  I did not mean that one should simply render frames as whatever rate and "do nothing" (which results in tearing) to display them.  One needs to be aware that there WILL be a discrepancy between what framerate one asks for and what one actually gets (it will probably be fairly small, <1Hz, but it will be there).  If one is rendering stuff on the fly, it's not hard to make sure that a single frame is always ready when the user's hardware needs a new one.  Waiting for vsync or using triple buffering will do just that and avoid tearing artifacts.

Indeed, this is probably one of the earlier lessons in a computer graphics curriculum.

What I meant was that using the actual refresh rate (whatever it may be) as an indicator of elapsed "wall time" (based on your requested framerate) is a BAD idea on a PC.  Suppose you have a pre-rendered animation of a clock that is rendered at 60FPS.  If you just blindly assume that you should have a 1:1 mapping between source frames and output frames, your animation will eventually show the wrong time.  You somehow need to handle the small difference between your "perfect 60FPS" notion in the input material (which is fine, since you have to pick SOMETHING for pre-rendered material) and the actual output frame rate.  Usually, duplicating/dropping frames is fine.  It avoids messy interpolation and doesn't have any tearing issues.
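A minimal sketch of that idea (the names are made up): derive the source-frame index from elapsed wall time instead of counting output frames, so duplication and dropping fall out automatically and the on-screen clock stays correct:

```python
SOURCE_FPS = 60.0   # nominal rate of the pre-rendered material

def source_frame_for(elapsed_seconds: float) -> int:
    # Map elapsed wall time to a source-frame index. If the real output
    # rate is slightly below 60Hz, some indices repeat (duplicated frames);
    # if slightly above, some are skipped (dropped frames). Either way the
    # animation never drifts from wall time.
    return int(elapsed_seconds * SOURCE_FPS)

# Ten seconds of wall time always lands on frame 600, regardless of whether
# the monitor actually delivered 599 or 601 refreshes in that interval.
print(source_frame_for(10.0))   # 600
```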

Note that it's common practice to use the vsync rate on consoles and arcade game hardware as an indication of wall time.  It can simplify program design to do so, and consoles and arcade game boards can be carefully controlled and characterized to guarantee a certain output framerate.  The vast differences in PC hardware and driver configurations make this relatively impossible in a PC environment unless you have full end-to-end control over the hardware and software.


If MAME is requesting "224x288" from GDI or DirectX, then that's almost surely a "visible" or "active" pixel count.  You can measure this in hardware, too: count the number of hsync pulses between vsync active edges (you have to use a single edge rather than the whole pulse, since a few lines occur during vsync; fortunately most counters support this).
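As arithmetic, that measurement reduces to two ratios (a sketch with illustrative names; the numbers in the test use the Galaga-class timings quoted elsewhere in this thread):

```python
def hsync_rate(pixclock_hz, htotal):
    """Horizontal line rate: pixels per second over pixels per line."""
    return pixclock_hz / htotal

def vrefresh(pixclock_hz, htotal, lines_per_frame):
    """Vertical refresh implied by the number of hsync pulses (total
    lines) counted between successive vsync active edges."""
    return hsync_rate(pixclock_hz, htotal) / lines_per_frame
```

With a 6.144MHz dotclock and 384 clocks per line, the line rate is 16kHz; divide by 264 counted lines per frame and the 60.606Hz figure drops out.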

torino

  • -Banned-
  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 201
  • Last login:July 24, 2011, 05:18:12 pm
  • -Banned-
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #11 on: March 28, 2011, 09:40:22 am »
Gah, I had a nice, long reply to this typed out, but smurfing SMF ate it when I accidentally hit the back button.

Essentially what you need to know is this: your impressions of LCD monitors being "purely fixed frame rate" are largely wrong.  Just like "fixed rate" CRTs, LCDs will accept a certain range of input refresh rates and WILL ACTUALLY REFRESH THE DISPLAY AT THAT RATE.  The exact range depends on the design of the monitor and is usually further limited by the panel and its accompanying driving electronics.  Common ranges are 47-63, 57-75, and 47-75Hz, but others of course exist.  What your monitor will do when presented with something outside this range depends on what the designer told it to do: some will attempt to convert while others will give an "out of range" error.  Since all LCD monitors have scalers anyway (they ARE fixed-pixel-resolution devices), it's usually not terribly hard to include frame rate conversion, but it simplifies the scaler chain substantially to avoid it.

Of course, there may exist some brain-dead monitor designs out there that attempt to convert all inputs to the monitor's idea of its "frame rate".  I don't know WHY you'd do it that way, but it's of course possible.  I've never seen one.  If you have one, I'd suggest defenestrating it.

I see, and I'm surprised to hear that. OK, so we agree there's a big difference between "converting" and "adjusting" the frame rate, but you're still saying most LCDs can vary their vertical refresh rate around 60Hz the way CRTs can? I'm convinced they can't: they're either fixed at exactly 60Hz or have a PAL/NTSC converter built in, and a converter doesn't help with small differences like +0.606061Hz, because that choppiness is too sparse for the converter to smooth out without introducing considerable lag. All of that works great for movies, but it's not really desirable in an interactive application like a game.

In any case, none of my laptops can properly sync to any MAME game that isn't EXACTLY 60Hz, and you say your LCD can? Could you then post a photo of MAME running Galaga next to the LCD's OSD screen showing 60.606061Hz, or something like that?


Quote
As to proper PC programming practice, I think you misunderstood me.  I did not mean that one should simply render frames at whatever rate and "do nothing" (which results in tearing) to display them.  One needs to be aware that there WILL be a discrepancy between the framerate one asks for and what one actually gets (it will probably be fairly small, <1Hz, but it will be there).  If one is rendering stuff on the fly, it's not hard to make sure that a single frame is always ready when the user's hardware needs a new one.  Waiting for vsync or using triple buffering will do just that and avoid tearing artifacts.

Indeed, this is probably one of the earlier lessons in a computer graphics curriculum.

What I meant was that using the actual refresh rate (whatever it may be) as an indicator of elapsed "wall time" (based on your requested framerate) is a BAD idea on a PC.  Suppose you have a pre-rendered animation of a clock that is rendered at 60FPS.  If you just blindly assume that you should have a 1:1 mapping between source frames and output frames, your animation will eventually show the wrong time.  You somehow need to handle the small difference between your "perfect 60FPS" notion in the input material (which is fine, since you have to pick SOMETHING for pre-rendered material) and the actual output frame rate.  Usually, duplicating/dropping frames is fine.  It avoids messy interpolation and doesn't have any tearing issues.

Note that it's common practice to use the vsync rate on consoles and arcade game hardware as an indication of wall time.  It can simplify program design to do so, and consoles and arcade game boards can be carefully controlled and characterized to guarantee a certain output framerate.  The vast differences in PC hardware and driver configurations make this relatively impossible in a PC environment unless you have full end-to-end control over the hardware and software.

What you're describing I would call "temporal resolution", or the precision of motion processing. That may matter in physical simulation, but the simulation still needs to sync its graphical output with the monitor if fluid animation of its visual representation is desired.


.   .   .   .   .   .   .   .   .   .   .   .   .   .   .   .  

...... ......................... ...................... .....



You are talking about density and quantity, but I'm talking about timing and regularity. Consider that even 25Hz of steady, uniformly distributed frames will give you better and smoother animation than an uneven 100Hz. Only integer ratios work "correctly": scaling 30Hz to 60Hz (doubling), or say 50Hz to 25Hz (halving), stays somewhat authentic, or at least is the "least noticeable" difference.
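That cadence argument can be made concrete with a throwaway helper (illustrative, integer rates only): count how many output refreshes each source frame occupies.

```python
def cadence(source_fps, output_hz, refreshes):
    """Repeat count per source frame when mapping source_fps material
    onto an output_hz display, using integer frame indices."""
    shown = [k * source_fps // output_hz for k in range(refreshes)]
    return [shown.count(f) for f in sorted(set(shown))]
```

Doubling 30Hz onto 60Hz gives a perfectly even [2, 2, 2, 2, 2, 2], while 24Hz onto 60Hz gives the uneven 3:2 pattern [3, 2, 3, 2] — the classic telecine judder.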


Quote
If MAME is requesting "224x288" from GDI or DirectX, then that's almost surely a "visible" or "active" pixel count.  You can measure this in hardware, too: count the number of hsync pulses between vsync active edges (you have to use a single edge rather than the whole pulse, since a few lines occur during vsync; fortunately most counters support this).

EDITED: OK, yes, that sounds great, only I can't get any reading with my multimeter. But if you're right that most LCDs can vary their refresh rate around 60Hz and sync correctly with MAME's 60.606061Hz Galaga, then I just need to find a proper monitor and the problem is solved, thank you very much... unless you're wrong. :-)
« Last Edit: March 28, 2011, 09:44:37 pm by torino »

torino

  • -Banned-
  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 201
  • Last login:July 24, 2011, 05:18:12 pm
  • -Banned-
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #12 on: March 28, 2011, 09:52:27 am »
Consider this random game, "motos", which has its XML data complete; let's calculate its vfreq:

      <chip type="cpu" tag="maincpu" name="M6809" clock="1536000"/>
      <chip type="cpu" tag="sub" name="M6809" clock="1536000"/>
      <chip type="audio" tag="namco" name="Namco 15XX" clock="24000"/>
      <display type="raster" rotate="90" width="288" height="224" refresh="60.606061" pixclock="6144000" htotal="384" hbend="0" hbstart="288" vtotal="264" vbend="0" vbstart="224" />

So you have a visible resolution of 288 x 224, but the total resolution is what matters:

htotal x vtotal = 384 x 264 = 101376 total pixels per frame

Now we need the dotclock. This one is probably an integer multiple of the CPU master clock; here:

dotclock = 4 x master_clock = 4 x 1536000 = 6144000 Hz (this was just to explain where the pixclock value in the XML data comes from)

Now we calculate vfreq:

vfreq = 6144000 / 101376 = 60.60606061 Hz

- htotal and vtotal can be obtained by reverse-engineering the PCB hardware.
- the master clock frequency, too.

That's the theoretical vfreq. Then of course you can plug an oscilloscope into the vsync output and check the actual value obtained.
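To check the arithmetic mechanically, here's a small sketch that pulls the timing attributes out of a <display> line like the one above and recomputes the refresh (a simplified throwaway parser, not MAME's own tooling; the sample line is trimmed to the attributes used):

```python
import re

def vfreq_from_display(display_xml):
    """Recompute vertical refresh from a MAME -listxml <display> line:
    vfreq = pixclock / (htotal * vtotal)."""
    attrs = dict(re.findall(r'(\w+)="([^"]*)"', display_xml))
    return int(attrs["pixclock"]) / (int(attrs["htotal"]) * int(attrs["vtotal"]))

line = ('<display type="raster" rotate="90" width="288" height="224" '
        'refresh="60.606061" pixclock="6144000" htotal="384" vtotal="264" />')
print(round(vfreq_from_display(line), 6))  # matches the refresh attribute
```

Running it reproduces the quoted 60.606061 figure from the three integer timing values alone.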


Thank you, that's brilliant, it really is. I trust your mathematics; however, the question remains - where did those numbers come from? We can plug in many different numbers in there and get all kinds of numbers as a result... EDIT: so I need a damn oscilloscope, and I was hoping I could grab some fancy multimeter from my dad's garage for this, huh.

Why do you think the CPU would work in multiples of the refresh rate? Could it be that their measurement was influenced by the regularity of the video interrupt, so it only appeared to their instruments as if the CPU works like that while it actually does not? Would you really be surprised if I told you my Galaga PCB just so happens to work with my totally fixed 60Hz LCD monitor, and the OSD says it's being run at exactly 60.00Hz?
« Last Edit: March 28, 2011, 10:14:15 am by torino »

Calamity

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 7463
  • Last login:July 19, 2025, 04:03:33 am
  • Quote me with care
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #13 on: March 28, 2011, 11:50:34 am »
Quote
where did those numbers come from?

Some may be guessed, some actually measured. I don't know exactly how they do it, but I trust this information as the best available, and in fact these values sometimes change as MAME is updated, probably due to more exact measurements being made.

Quote
We can plug in many different numbers in there and get all kinds of numbers as a result...

Well, not so many, actually: horizontal values are really 'characters', not pixels, so they are always multiples of 8. The master clock frequency is usually a known value. The total number of lines (vtotal) is measurable with an oscilloscope, and so is the vertical refresh. So there aren't many possible combinations of htotal, vtotal and dotclock that can produce a measured vfreq.
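That pruning can be sketched as a brute-force search (the multiplier choices and the htotal/vtotal ranges below are illustrative guesses, not hardware facts):

```python
def candidate_timings(master_hz, measured_vfreq, tol=0.005):
    """Enumerate (dotclock multiplier, htotal, vtotal) combinations
    consistent with a measured vertical refresh, stepping htotal by
    8-pixel 'characters'."""
    hits = []
    for mult in (2, 3, 4, 6, 8):                # plausible integer multipliers
        dotclock = master_hz * mult
        for htotal in range(256, 520, 8):       # characters -> multiples of 8
            for vtotal in range(240, 320):
                if abs(dotclock / (htotal * vtotal) - measured_vfreq) < tol:
                    hits.append((mult, htotal, vtotal))
    return hits
```

For a 1.536MHz master clock and a measured 60.606Hz, the known Motos timing (x4, 384, 264) shows up among only a handful of survivors, which is the point: the measurement plus a few hardware conventions nearly pins the numbers down.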

Quote
Why do you think the CPU would work in multiples of the refresh rate?

No, I didn't say that. The refresh rate here is the final result. It's the dotclock that is probably an integer multiple of the master clock. Applying a logic of simplicity: these machines probably had a single master oscillator, and the rest of the frequencies were likely obtained by multiplying or dividing that master frequency by an integer value.

Quote
Could it be that their measurement was influenced by the regularity of the video interrupt, so it only appeared to their instruments as if the CPU works like that while it actually does not? Would you really be surprised if I told you my Galaga PCB just so happens to work with my totally fixed 60Hz LCD monitor, and the OSD says it's being run at exactly 60.00Hz?

Unfortunately I'm not a native English speaker, so I can't handle irony here the way I'm used to in my own language. Anyway, I have the feeling your style sounds familiar to me.
Important note: posts reporting GM issues without a log will be IGNORED.
Steps to create a log:
 - From command line, run: groovymame.exe -v romname >romname.txt
 - Attach resulting romname.txt file to your post, instead of pasting it.

CRT Emudriver, VMMaker & Arcade OSD downloads, documentation and discussion:  Eiusdemmodi

MonMotha

  • Trade Count: (+2)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 2378
  • Last login:February 19, 2018, 05:45:54 pm
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #14 on: March 28, 2011, 03:25:54 pm »
If you honestly want help troubleshooting this, you will need to define "sync up".  I'm suspecting a simple driver issue which may or may not be easily resolved.

I'd also suggest being less aggressive.  You're coming off as a bit trollish.  In fact, you remind me of a friendly forum troll we know as driver-man.  If that's not who you are, trust me when I say you don't want to be mistaken for this person.

torino

  • -Banned-
  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 201
  • Last login:July 24, 2011, 05:18:12 pm
  • -Banned-
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #15 on: March 28, 2011, 09:46:05 pm »
Quote
where did those numbers come from?

Some may be guessed, some actually measured. I don't know exactly how they do it, but I trust this information as the best available, and in fact these values sometimes change as MAME is updated, probably due to more exact measurements being made.

We can plug in many different numbers in there and get all kinds of numbers as a result...

Well, not so many, actually: horizontal values are really 'characters', not pixels, so they are always multiples of 8. The master clock frequency is usually a known value. The total number of lines (vtotal) is measurable with an oscilloscope, and so is the vertical refresh. So there aren't many possible combinations of htotal, vtotal and dotclock that can produce a measured vfreq.

Why do you think the CPU would work in multiples of the refresh rate?

No, I didn't say that. The refresh rate here is the final result. It's the dotclock that is probably an integer multiple of the master clock. Applying a logic of simplicity: these machines probably had a single master oscillator, and the rest of the frequencies were likely obtained by multiplying or dividing that master frequency by an integer value.

Could it be that their measurement was influenced by the regularity of the video interrupt, so it only appeared to their instruments as if the CPU works like that while it actually does not? Would you really be surprised if I told you my Galaga PCB just so happens to work with my totally fixed 60Hz LCD monitor, and the OSD says it's being run at exactly 60.00Hz?

Unfortunately I'm not a native English speaker, so I can't handle irony here the way I'm used to in my own language. Anyway, I have the feeling your style sounds familiar to me.

I believe my question is very reasonable, and if you think it's not, then please explain why. I also agree with what you said and find it all reasonable too, but different people have different priorities and curiosities, so I must ask: what if Galaga is meant to run at a round 60Hz? What if those digits behind the decimal point are just the result of imperfections in that particular measurement and that specific board, maybe due to age or heat? It would be really silly if we now couldn't properly sync this game with our fixed 60Hz LCDs just because of some measurement error, wouldn't it? And do you agree that even PC CRTs have (had) enough flexibility to run at a vertical 60.606061Hz, while LCDs cannot, since they have a truly fixed refresh rate?
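On the measurement-error worry: assuming the timing values quoted earlier in the thread, the fractional part of MAME's number isn't a noisy reading at all — it falls straight out of integer hardware counts, as a quick check with Python's fractions module shows:

```python
from fractions import Fraction

# pixclock / (htotal * vtotal) with the values quoted for Motos:
# a 6,144,000 Hz dotclock over 384 x 264 clocks per frame.
vfreq = Fraction(6_144_000, 384 * 264)

# The result is an exact rational number, not a measured approximation.
assert vfreq == Fraction(2000, 33)
print(float(vfreq))  # ~60.6060606 repeating
```

If the board's crystal is accurate and the counters are integers, the "weird" .606061 is the exact value 2000/33 — a round 60.00Hz would require different integer timings, not a better measurement.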

torino

  • -Banned-
  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 201
  • Last login:July 24, 2011, 05:18:12 pm
  • -Banned-
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #16 on: March 28, 2011, 09:53:25 pm »
If you honestly want help troubleshooting this, you will need to define "sync up".  I'm suspecting a simple driver issue which may or may not be easily resolved.

I'd also suggest being less aggressive.  You're coming off as a bit trollish.  In fact, you remind me of a friendly forum troll we know as driver-man.  If that's not who you are, trust me when I say you don't want to be mistaken for this person.

I'm sorry, you lost me there. What "sync up", and what driver are you talking about? Is there any way to use some fancy multimeter, or maybe a computer, to measure this stuff? The joystick port is analog and can take a variable voltage in the 5V range, so there must be some software that can turn a PC into an oscilloscope, one way or another, right? Google, here I come...
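Whatever the capture path, the software side of a PC "oscilloscope" frequency readout reduces to something like the sketch below (synthetic data and invented names; a real sound-card capture is AC-coupled, so a real pulse train would need proper edge conditioning first):

```python
def pulse_frequency(samples, sample_rate, threshold=0.5):
    """Estimate the repetition rate of a pulse train from sampled data
    by averaging the spacing of rising threshold crossings."""
    rising = [i for i in range(1, len(samples))
              if samples[i - 1] < threshold <= samples[i]]
    if len(rising) < 2:
        return None                      # not enough edges to measure
    span_s = (rising[-1] - rising[0]) / sample_rate
    return (len(rising) - 1) / span_s
```

Feeding it one second of a synthetic 60.606Hz square wave sampled at 44.1kHz recovers the rate to well within a tenth of a hertz — plenty to tell 60.00 from 60.61 on a vsync line.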


Have you realized by now that LCDs cannot update the screen across a variable range of refresh rates? Even where they accept some range as input, they end up refreshing the screen at an internal rate, usually 60Hz, or perhaps 120Hz. If one claims 75Hz or 85Hz, better watch out: your picture (motion) will be ruined, since it will most likely just be converted down to 60Hz, and you'll end up with worse animation than if you had fed it 60Hz to start with.

Otherwise you seem well aware of the technology involved in processing, converting and displaying video streams of different formats, and I thank you for taking the interest and time for this little chat. The only other thing I can think of to complement your knowledge would be to point out how it all started. If you don't already know, I think you'll find it interesting: http://en.wikipedia.org/wiki/Telecine

SavannahLion

  • Wiki Contributor
  • Trade Count: (+1)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 5986
  • Last login:December 19, 2015, 02:28:15 am
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #17 on: March 29, 2011, 12:25:28 am »
I'd also suggest being less aggressive.  You're coming off as a bit trollish.  In fact, you remind me of a friendly forum troll we know as driver-man.  If that's not who you are, trust me when I say you don't want to be mistaken for this person.

I concur. I reached that conclusion pretty early.

SavannahLion

  • Wiki Contributor
  • Trade Count: (+1)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 5986
  • Last login:December 19, 2015, 02:28:15 am
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #18 on: March 29, 2011, 12:54:56 am »
What your monitor will do when presented with something outside this range depends on what the designer told it to do: some will attempt to convert while others will give an "out of range" error.

Remember when there was no such safety mechanism? You had to be careful not to set the range too far out of whack.  ;D

Hoopz

  • Don't brand me a troublemaker!
  • Trade Count: (+8)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 5285
  • Last login:June 13, 2025, 09:18:32 pm
  • Intellivision Rocks!
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #19 on: March 29, 2011, 06:50:34 am »
Reminds me more of Genesim than Driverman.  YMMV.

torino

  • -Banned-
  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 201
  • Last login:July 24, 2011, 05:18:12 pm
  • -Banned-
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #20 on: March 29, 2011, 01:20:33 pm »
Reminds me more of Genesim than Driverman.  YMMV.

I think the topic at hand is far more interesting to talk about than the friendly neighbourhood troll, the amazing Driver-man dude or whatever you call him, but perhaps I should make it clear that, unlike the Genesim character, I dislike LCDs. However, while I mostly play games on my cabinets with CRTs, I still like to use emulators to check out other games, and I simply can't stand seeing things like scroll tearing and animation choppiness. I also know some people, for some strange reason, prefer LCDs, and I think it's only fair that they be aware of this very important, and less known, factor before they ruin these games by unwisely choosing an LCD over a CRT.


And now, to get back on topic: I got myself an oscilloscope! It works with the sound card's microphone input, which probably shares the same ADC chip as the joystick port, but is even easier to connect. Unlike much other software that uses the USB port, this one does not require any additional hardware, only a microphone cable with a standard audio jack (stereo, for two-axis input). Very cool!

http://www.zelscope.com/index.html

Does anyone want to join this testing? It would be great if we could get independent measurements of the same game, right? I have these PCBs: Time Pilot, Popeye, Mr. Do!, Vs. Karate Champ, Yie Ar Kung-Fu, Green Beret, Kung-Fu Master and Rampart. My Galaga and Moon Patrol are broken, so I use MAME for those two. Anyway, is anyone interested? Which one shall we test?
 


Ladies and gentlemen, the time has come to throw the dice. Please place your bets.


BLACK: Any measurement will give exactly the same refresh rate as MAME reports - 60.60606061Hz

RED: Repeated measurements, averaged to minimise errors, will point to almost exactly 60.00Hz

EVEN: All measurements will be pretty consistent within one setup, but the value may differ across different boards

ODD: You are a troll, I hate you!
« Last Edit: March 29, 2011, 01:23:39 pm by torino »

Gray_Area

  • -Banned-
  • Trade Count: (+1)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 3363
  • Last login:June 23, 2013, 06:52:30 pm
  • -Banned-
Re: Vertical Refresh Rate: set by game software or monitor hardware?
« Reply #21 on: March 29, 2011, 10:40:07 pm »
Whatever the details of what and who, I'm surprised the first post wasn't something along the lines of, 'Read these', followed by links to 'How CRTs work', etc.
-Banned-