Vertical Refresh Rate: set by game software or monitor hardware?
| torino:
--- Quote from: ahofle on March 28, 2011, 12:32:36 am ---
--- Quote from: torino on March 27, 2011, 05:52:16 pm ---
--- Quote from: ahofle on March 27, 2011, 04:53:07 pm ---
The PCB will drive the appropriate resolution/refresh of the arcade monitor -- it doesn't wait for any feedback from the monitor. Also, there is no 'fixed' 60Hz on a true CRT monitor, but rather a range of values that defines the supported frequency range of the monitor.
--- End quote ---

How exactly do you propose a PCB can "drive" the monitor refresh rate? Do you think vertical refresh can vary from frame to frame, so that when the game software slows down it would accordingly slow down the monitor refresh rate? Of course not, and of course the software (good software, anyway) waits for the VSync signal, otherwise we get choppiness and tearing.
--- End quote ---

It "drives" it the same way Windows "drives" it when you select a particular resolution. That's what the sync wire(s) in a VGA cable are for. I didn't say anything about varying the sync from frame to frame, not sure where you pulled that from. "Wait for vsync" just means that the application will match its internal frame rate to the exact sync of the monitor, which is a value already known to the application or PCB. The game is not "listening" for anything from the monitor. You are suggesting that the application or game will stop running if you disconnect the monitor because it then can't get the vsync "signal". ??? I don't know how the vertical refresh rate for Galaga was determined, but it is just one variable in a fairly complex function. Read up on modelines for more information.
--- End quote ---

I realize my mistake was to say "monitor hardware" could influence the workings of the PCB, when the RGB+Vsync lines are output only. The signal goes in just one direction, from PCB to monitor, so you are indeed correct that there is no feedback coming from the monitor itself. However, there is a third thing standing in between: the video adapter. The game software boots up, initialises the video adapter, and by setting the resolution (width and height in pixels) it effectively programs the adapter's dotclock. From that point on, it is the video adapter's dotclock timing that drives the monitor refresh rate, and in return the Vblank signal is ultimately also made available for the software to sync itself with.

Still, how did they obtain that number, and what is the error of that measurement? What does "224x288" refer to, visible pixels or total pixels? And what if Galaga is meant to run at a round 60Hz, and the digits behind the decimal point are really just the result of imperfections in that particular measurement and that specific board, maybe due to age or heat? It would be really silly if we could not properly sync this game with our fixed 60Hz LCDs just because of a measurement error, wouldn't it?
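For reference, the "modelines" mentioned above encode exactly the quantities under discussion: a dotclock plus the total raster dimensions, from which the refresh rate follows. A minimal Python sketch of that arithmetic, using the standard VESA 640x480@60 timings purely as an illustration (these are not Galaga's numbers):

    # Vertical refresh from a modeline: vfreq = dotclock / (htotal * vtotal).
    # Values below are the standard VESA 640x480@60 timings, used only as
    # an example -- not the timings of any arcade board.
    dotclock_hz = 25175000    # pixel clock: 25.175 MHz
    htotal = 800              # visible 640 pixels + front porch + sync + back porch
    vtotal = 525              # visible 480 lines + blanking lines

    hfreq_hz = dotclock_hz / htotal   # horizontal rate: 31468.75 Hz (~31.5 kHz)
    vfreq_hz = hfreq_hz / vtotal      # vertical refresh: ~59.94 Hz

    print("hfreq = %.2f Hz, vfreq = %.4f Hz" % (hfreq_hz, vfreq_hz))

Note that the visible resolution is only part of the total; the blanking intervals count toward the timing as well, which bears directly on the 224x288 question raised above.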
| torino:
--- Quote from: SavannahLion on March 28, 2011, 01:41:35 am ---
ahofle, I may be wrong here, but it sounds like the OP is attempting to apply modern architecture concepts to the old hardware. I mean... feedback from the monitor?

To the OP: Arcade monitors, heck, most any monitor from that era, don't have any "feedback" on the state of the monitor. Either the signal gets there or it doesn't. If it doesn't, it's called playing blind.

The frequency rate is something that is/was established by the industry as a whole. There is a ton of history behind the design of the CRT, and behind how and why countries like the U.S. chose 60Hz whereas other countries use 50Hz. Why computers up to a specific speed never used a video card (hint: check the history on the South Bridge). Why buffers aren't used on some boards. So on and so forth. The short answer is that the developers already know the frequency rate of the monitor going into the design of any game.

Remember, CRT (and later LED/LCD/plasma/whatever) manufacturers don't cater to just arcade cabinets. Those tubes and panels are manufactured for a wide range of applications.
--- End quote ---

What do you mean, "developers already know"? How? Why are you talking about the "frequency rate of the monitor" when that's irrelevant, given that a CRT's range can cover all sorts of resolutions and refresh rates? I'm a developer too and I have several game PCBs, but I don't "already know", so please tell me: do I read it from somewhere, or do I measure something, and what exactly?
| MonMotha:
The monitor is a pure slave device. It will accept frames at whatever rate the video source sends them, as long as it is within its capabilities. Depending on the monitor design, it may attempt to convert the input frame rate (by duplicating/dropping frames or some more complicated method) to something else. Fully analog video paths are not capable of this, and that describes the vast majority of multisync and standard-definition CRT monitors and TVs (including those with digital OSDs and such -- the video input path is still analog in most designs). High-end LCDs and plasmas are also capable of driving the panel at somewhat arbitrary refresh rates, though their range is smaller than that of high-end multisync CRTs. Low-end LCDs and plasmas usually have a very limited range floating right around 60Hz and will have to convert everything to something in that range.

CRT arcade monitors are usually capable of syncing between 47-63Hz. Adjustment of a "VHOLD", "VFREQ" or "50/60Hz" control may be required as this rate is varied on some monitors. Some, especially newer multisyncs, can go quite a bit higher. Some arcade games ran at oddball frequencies; Mortal Kombat II, for example, is about 53Hz.

Now, that handles the monitor. Further complicating the situation is the PC hardware and software. On Windows, it's damn near impossible to actually ask for and get an exact set of video timings, even if you know exactly what you want. The interface for asking for anything more specific than "visible resolution + refresh rate" is driver-specific and often requires a reboot! Even then, most drivers will give you something close and call it good enough. This makes attempting to time things based on the vertical refresh rate darn near impossible. Some arcade games have made this mistake when moving from dedicated or console-based platforms, where such timing behavior is common, to PCs.

On Linux, you can certainly ask for an exact set of video timings using XRANDR or as a mode in the X configuration file, but what you get again depends on the driver. Usually it's close, but it won't be exact. It'll be whatever is closest among the values it's possible to program the PLL on the video card to.

The software also has to know how to ask. MAME does not ask for exact video timings in any situation, at least not that I'm aware of. MAME asks for a visible resolution and an approximate framerate (IIRC, Windows is limited to specifying integer framerates, but I could be wrong on this). Sometimes it would be possible to use this information to work out exactly what game MAME is playing and set the video hardware to the closest possible setting, but sometimes this information is ambiguous, and again, the video hardware may not be able to hit EXACTLY what the original game expects. The ArcadeVGA, for example, comes pre-programmed with a list of such pairs ("what MAME asks for" vs. "what you get in an attempt to match what MAME really wants") for common games.

I don't know enough about the internals of MAME to know how it handles any remaining discrepancy when attempting to output "native" video. In theory, it would be possible to back-time the entire emulator based on the actually-obtained vertical refresh rate (the remaining mismatch likely being imperceptible), but I don't know if it does this, and it would slightly compromise the accuracy of the game timing -- though probably not any worse than other issues surrounding timers on a preemptive, multitasking OS running on a general-purpose PC.

If it does not do this, which I don't think it does, the visual behavior will depend on your output settings (wait for vsync, triple buffer, etc.). If wait for vsync is enabled, a frame will (very) occasionally be dropped to compensate for the slight difference in refresh rates between the emulated machine and the actual video output. If triple buffering is also enabled, then a frame might occasionally be duplicated instead, if the output runs faster than the emulation. If wait for vsync and triple buffering are both disabled, then a tear line will (very) slowly scroll up or down the screen. This might happen so slowly that it never makes it out of the inactive video region while you're playing, if the two rates are very close.

In general, it's considered poor design for a PC application to require a specific set of video timings for proper operation. The preferred method is to draw frames as fast as possible, or as fast as the output is going while waiting for vsync, and to time the application itself off a more "consistent" timer facility. Essentially, one should decouple the video timings from the timings of the application as a whole. This goes for other hardware-generated timings, too, such as audio sample clocks. Unfortunately, emulation of arcade games requires video (and audio) to run at exactly a specific set of timings, so there this is unavoidable.

If you want to measure the exact timings used by a game PCB, there are a few ways to do it. The easiest is to use an instrument commonly known as a "frequency counter" on the hsync and vsync lines. This will give you the exact frequency of each signal, accurate to whatever the instrument is capable of (generally 0.1Hz or better). Measuring the hsync and vsync rates and knowing the active video resolution, which you can usually infer from a little reverse engineering of the software, gives you enough information to fully emulate the game (the only missing information is the distribution of time between the front and back porches, which just determines the centering of the video on the monitor). Of course, sometimes you can just ask the former game developers for this information :)
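To make the frequency-counter method concrete, here is a minimal Python sketch of the arithmetic involved. The "readings" are Galaga's nominal values used as stand-ins for real instrument output, so treat them as assumptions:

    # Working back from frequency-counter readings on the sync lines.
    hsync_hz = 16000.0      # assumed reading on the hsync line (16.0 kHz)
    vsync_hz = 60.606061    # assumed reading on the vsync line

    # Total scanlines per frame (visible + blanking) fall out directly:
    vtotal = hsync_hz / vsync_hz
    print("vtotal ~= %.1f lines per frame" % vtotal)   # ~264.0

    # If reverse engineering gives you htotal (384 here), the dotclock follows:
    htotal = 384
    dotclock_hz = hsync_hz * htotal
    print("dotclock = %.0f Hz" % dotclock_hz)          # 6144000 Hz

These numbers are consistent with the xml data Calamity posts below.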
| torino:
--- Quote from: MonMotha on March 28, 2011, 02:30:55 am ---
The monitor is a pure slave device. It will accept frames at whatever rate the video source sends them, as long as it is within its capabilities. Depending on the monitor design, it may attempt to convert the input frame rate (by duplicating/dropping frames or some more complicated method) to something else. Fully analog video paths are not capable of this, and that describes the vast majority of multisync and standard-definition CRT monitors and TVs (including those with digital OSDs and such -- the video input path is still analog in most designs). High-end LCDs and plasmas are also capable of driving the panel at somewhat arbitrary refresh rates, though their range is smaller than that of high-end multisync CRTs. Low-end LCDs and plasmas usually have a very limited range floating right around 60Hz and will have to convert everything to something in that range.
--- End quote ---

Yes, but that is only true for the LCD/plasma type of monitor. Those have truly fixed frequencies, so they have to do this "digital processing", which is a very artificial and poor-looking way to convert refresh rates. It is only intended for PAL/NTSC conversion, where there are +/-10 frames per second to play with and interpolate/extrapolate digitally; they are not designed to handle small offsets like Galaga's +0.6060, unless they can truly adjust their refresh rate as opposed to "converting" it.

On the other hand we have "pure analog" CRTs, and here you are making a wrong turn by expecting them to do this same digital thing, when they are actually in a much better position to start with: they do not even need to attempt it, and never should. By their design principle they are "naturally" either able to sync or not, and that's it. My 50Hz SCART TV can sync to 60Hz, and to anything in between and around; it's only that the picture shrinks as it gets close to 60Hz and some lines pop out from the top of the screen, but the picture is there, along with its authentic vertical refresh rate. Yes, some CRTs don't have such flexibility, but that is moving us away from the point. And the point is that this problem of emulating arcade games' vertical refresh properly only came around with LCDs; even PC CRTs allowed some flexibility in frequency, I think, so at least theoretically most of them could drive Galaga at 60.606061Hz without any additional converters, upscalers or whatever, in contrast to LCDs, which have a truly fixed rate and so cannot.

--- Quote ---
CRT arcade monitors are usually capable of syncing between 47-63Hz. Adjustment of a "VHOLD", "VFREQ" or "50/60Hz" control may be required as this rate is varied on some monitors. Some, especially newer multisyncs, can go quite a bit higher. Some arcade games ran at oddball frequencies; Mortal Kombat II, for example, is about 53Hz.
--- End quote ---

Yes, but if MAME is not able to set those 15kHz resolutions, or if you do not have such a monitor, that's of no use. To be precise, I am talking only about one specific case: a Galaga PCB (or MAME) plus a fixed 60Hz LCD monitor.

--- Quote ---
Now, that handles the monitor. Further complicating the situation is the PC hardware and software. On Windows, it's damn near impossible to actually ask for and get an exact set of video timings, even if you know exactly what you want. The interface for asking for anything more specific than "visible resolution + refresh rate" is driver-specific and often requires a reboot! Even then, most drivers will give you something close and call it good enough. This makes attempting to time things based on the vertical refresh rate darn near impossible. Some arcade games have made this mistake when moving from dedicated or console-based platforms, where such timing behavior is common, to PCs.

On Linux, you can certainly ask for an exact set of video timings using XRANDR or as a mode in the X configuration file, but what you get again depends on the driver. Usually it's close, but it won't be exact. It'll be whatever is closest among the values it's possible to program the PLL on the video card to.

The software also has to know how to ask. MAME does not ask for exact video timings in any situation, at least not that I'm aware of. MAME asks for a visible resolution and an approximate framerate (IIRC, Windows is limited to specifying integer framerates, but I could be wrong on this). Sometimes it would be possible to use this information to work out exactly what game MAME is playing and set the video hardware to the closest possible setting, but sometimes this information is ambiguous, and again, the video hardware may not be able to hit EXACTLY what the original game expects. The ArcadeVGA, for example, comes pre-programmed with a list of such pairs ("what MAME asks for" vs. "what you get in an attempt to match what MAME really wants") for common games.

I don't know enough about the internals of MAME to know how it handles any remaining discrepancy when attempting to output "native" video. In theory, it would be possible to back-time the entire emulator based on the actually-obtained vertical refresh rate (the remaining mismatch likely being imperceptible), but I don't know if it does this, and it would slightly compromise the accuracy of the game timing -- though probably not any worse than other issues surrounding timers on a preemptive, multitasking OS running on a general-purpose PC.

If it does not do this, which I don't think it does, the visual behavior will depend on your output settings (wait for vsync, triple buffer, etc.). If wait for vsync is enabled, a frame will (very) occasionally be dropped to compensate for the slight difference in refresh rates between the emulated machine and the actual video output. If triple buffering is also enabled, then a frame might occasionally be duplicated instead, if the output runs faster than the emulation. If wait for vsync and triple buffering are both disabled, then a tear line will (very) slowly scroll up or down the screen. This might happen so slowly that it never makes it out of the inactive video region while you're playing, if the two rates are very close.
--- End quote ---

The video card and software are irrelevant if the monitor can't do it. All the software has to do is wait for the vertical refresh signal, which is an IRQ raised by the graphics adapter. However, that will not work with emulators if the game is supposed to produce more or fewer frames per second than the monitor can manage; even if the difference is just a small fraction, you will still get the full ugliness of the effect. And so LCDs can only give you the authentic vertical refresh rate if the game's refresh rate is also exactly 60Hz, or by forcing the game to a 60Hz update by slowing it down or speeding it up to 60Hz, which is the only good-looking and proper way to deal with LCDs. Anything else you try in order to "fix" this situation is just as artificial and ugly a hack as digitally interpolating/extrapolating frames, which makes non-existent frames out of thin air or deletes existing ones and tries to spread what's left of the original frames evenly over time.

--- Quote ---
In general, it's considered poor design for a PC application to require a specific set of video timings for proper operation. The preferred method is to draw frames as fast as possible, or as fast as the output is going while waiting for vsync, and to time the application itself off a more "consistent" timer facility. Essentially, one should decouple the video timings from the timings of the application as a whole. This goes for other hardware-generated timings, too, such as audio sample clocks. Unfortunately, emulation of arcade games requires video (and audio) to run at exactly a specific set of timings, so there this is unavoidable.
--- End quote ---

That's so very wrong. Where did you get that? What exactly do you think is "poor" about having fluid animation and smooth scrolling? It's called "perfect", not "poor". And what benefit do you imagine there is in processing frames as fast as possible if many of them will never even be displayed on the screen at all, while you still get tearing and choppiness? What could possibly be a more "consistent" timer facility where computer animation is concerned, and how would it help you if you don't sync to the monitor refresh in the end? It's the first lesson of computer animation to sync everything to the monitor refresh rate. That is the preferred method, of course, and there is absolutely no reason to do otherwise, especially considering all the ugly effects you are bound to get. So please, where did you hear anything like that? How did you come to state what you just said?

--- Quote ---
If you want to measure the exact timings used by a game PCB, there are a few ways to do it. The easiest is to use an instrument commonly known as a "frequency counter" on the hsync and vsync lines. This will give you the exact frequency of each signal, accurate to whatever the instrument is capable of (generally 0.1Hz or better). Measuring the hsync and vsync rates and knowing the active video resolution, which you can usually infer from a little reverse engineering of the software, gives you enough information to fully emulate the game (the only missing information is the distribution of time between the front and back porches, which just determines the centering of the video on the monitor). Of course, sometimes you can just ask the former game developers for this information :)
--- End quote ---

Ok, thank you. So now we only need to establish what "224x288" refers to: visible pixels or total pixels?
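To put a number on the "slow the game down to 60Hz" option: a minimal Python sketch of the speed error involved, assuming Galaga's nominal 60.606061Hz rate is correct:

    # Cost of forcing a 60.606061Hz game onto a fixed 60Hz LCD.
    native_hz = 60.606061   # Galaga's nominal vertical refresh
    lcd_hz = 60.0           # a typical fixed-rate LCD panel

    # Option 1: retime the emulation to the panel -> the game runs slow.
    slowdown = (native_hz - lcd_hz) / native_hz
    print("speed error: %.2f%%" % (slowdown * 100))    # 1.00% slower

    # Option 2: keep the authentic speed -> frames must be dropped/duplicated.
    drift_frames_per_second = native_hz - lcd_hz       # ~0.61 frames/s
    print("one dropped or duplicated frame every %.2f s"
          % (1.0 / drift_frames_per_second))           # ~1.65 s

So the trade-off is between a uniform 1% slowdown (with correspondingly pitch-shifted audio, unless it is resampled) and a visible hitch roughly every second and a half.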
| Calamity:
Consider this random game, "motos", which has its xml data complete. Let's calculate its vfreq:

<chip type="cpu" tag="maincpu" name="M6809" clock="1536000"/>
<chip type="cpu" tag="sub" name="M6809" clock="1536000"/>
<chip type="audio" tag="namco" name="Namco 15XX" clock="24000"/>
<display type="raster" rotate="90" width="288" height="224" refresh="60.606061" pixclock="6144000" htotal="384" hbend="0" hbstart="288" vtotal="264" vbend="0" vbstart="224" />

So you have a visible resolution of 288 x 224, but the total resolution is what matters:

htotal x vtotal = 384 x 264 = 101376 total pixels per frame

Now we need the dotclock. This one is probably an integer multiple of the cpu master clock, here:

dotclock = 4 x master_clock = 4 x 1536000 = 6144000 Hz

(this was just to explain where the pixclock value in the xml data comes from)

Now we calculate vfreq:

vfreq = 6144000 / 101376 = 60.606061 Hz

- htotal and vtotal can be obtained by reverse engineering the pcb hardware.
- the master clock frequency too.

That's the theoretical vfreq. Then of course you can plug an oscilloscope into the vsync output and check the actual value obtained.
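The same calculation, restated as a small runnable Python sketch with all values taken from the motos xml data above:

    # vfreq for "motos" from MAME's xml data (values quoted above).
    master_clock_hz = 1536000             # M6809 cpu clock from the xml
    dotclock_hz = 4 * master_clock_hz     # = 6144000, the pixclock field
    htotal = 384                          # total raster width, not visible 288
    vtotal = 264                          # total raster height, not visible 224

    pixels_per_frame = htotal * vtotal    # 384 * 264 = 101376
    vfreq_hz = dotclock_hz / pixels_per_frame
    print("vfreq = %.6f Hz" % vfreq_hz)   # 60.606061 Hz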