GUN4IR - The Ultimate 4 Points Lightgun System
RandyT:
We are just talking past each other at this point, and your defenses seem to be in overdrive.  When a claim is made that X is as good or even better than Y relative to a certain specification, it should probably be expected that the claim will be called into question, and in cases where it doesn't seem plausible, the methodology used to come to that conclusion as well.  I've stated why, from a technical standpoint, the test you showed can't really affirm your claim.  You can accept that or not.

But as you stated, the "proof is in the pudding".  Obviously, the system has very good accuracy and quite possibly better than the original when dialed in, especially in the X direction where jitter is more of an issue with the original technology.  Interestingly, that increase in accuracy is precisely what can mask a possible difference in speed when the positions are averaged, as they are in that calibration screen.  Again, this is academic and related specifically to the claims you have made, and isn't intended to reflect poorly on the overall performance or quality of your system.

A more accurate lightgun system will trump a one or two frame difference in reporting speed all day long, for what good is more data if a larger percentage of it is incorrect?  Not much, and that doesn't seem to be the case with yours, so why so defensive?

Showing that a controller is as fast as the host system and software will ever allow is mind-numbingly simple, provided you have a few cycles to spare.  All it takes is adding a couple of lines of code to your firmware to toggle a spare pin, at the precise location where changes in your position calculations are "set in stone" and will be transferred, and then measuring the frequency of the signal on that pin with an oscilloscope.  So long as you aren't working from buffered data, it will paint a very clear picture as to timeliness of the updates.  If you are sending accurate updates 60 times a second and the host/software isn't able to act on those updates in near real-time, there's nothing you will ever be able to do about it.  In other words, even if it's not as timely as the original hardware and methodology, it's not on you, and the increase in accuracy will probably make up for it "in spades".
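To make that concrete, here is a rough Arduino-style illustration of the idea (DEBUG_PIN, readCameraAndComputePosition() and sendReport() are placeholder names invented for the sketch, not anyone's actual firmware):

--- Code: ---
// Hypothetical instrumentation, not real GUN4IR code: toggle a spare pin at the
// exact point where the finished position is handed off, then measure the
// interval between edges with an oscilloscope.
const uint8_t DEBUG_PIN = 9;   // any unused GPIO with a test point on it
bool debugLevel = false;

// Placeholders standing in for whatever the real firmware already does.
void readCameraAndComputePosition(int16_t &x, int16_t &y) { x = 0; y = 0; }
void sendReport(int16_t x, int16_t y) { (void)x; (void)y; }

void setup() {
  pinMode(DEBUG_PIN, OUTPUT);
}

void loop() {
  int16_t x, y;
  readCameraAndComputePosition(x, y);   // latest data from the camera module
  debugLevel = !debugLevel;
  digitalWrite(DEBUG_PIN, debugLevel);  // edge marks "position is set in stone"
  sendReport(x, y);                     // hand the data off to the host
}
--- End code ---

On the scope, the spacing between edges is the update period, and any wobble in that spacing is wobble in the update timing.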
JayBee:

--- Quote from: RandyT on December 31, 2022, 06:45:02 pm ---We are just talking past each other at this point, and your defenses seem to be in overdrive.  When a claim is made that X is as good or even better than Y relative to a certain specification, it should probably be expected that the claim will be called into question, and in cases where it doesn't seem plausible, the methodology used to come to that conclusion as well.  I've stated why, from a technical standpoint, the test you showed can't really affirm your claim.  You can accept that or not.

--- End quote ---
You are the only one here who actually doesn't agree with my methodology, and your logic is as broken and wrong as you think my methodology is.
Funny how you always treat everything you say as THE absolute truth and never consider that you could be wrong. What makes you so sure you are right here?
In reality, what you say is just your own opinion, just words on the internet with nothing else behind them; it has no more value than anybody else's opinion at that point.


--- Quote from: RandyT on December 31, 2022, 06:45:02 pm ---But as you stated, the "proof is in the pudding".  Obviously, the system has very good accuracy and quite possibly better than the original when dialed in, especially in the X direction where jitter is more of an issue with the original technology.  Interestingly, that increase in accuracy is precisely what can mask a possible difference in speed when the positions are averaged, as they are in that calibration screen.  Again, this is academic and related specifically to the claims you have made, and isn't intended to reflect poorly on the overall performance or quality of your system.

--- End quote ---
No, the CRT Guncon at that distance does not exhibit more jitter than my gun, and no, averaging the aim won't make any significant difference on FAST motion. It's very basic math, bud ;)


--- Quote from: RandyT on December 31, 2022, 06:45:02 pm ---A more accurate lightgun system will trump a one or two frame difference in reporting speed all day long, for what good is more data if a larger percentage of it is incorrect?  Not much, and that doesn't seem to be the case with yours, so why so defensive?

--- End quote ---
Again, the Guncon and the GUN4IR are equally accurate in that setup, and no, it has no bearing on the speed of fast motions. For instance, jitter of 2~3 pixels (which is about what we get on the guns at that distance) simply changes nothing when the gun moves at a speed of 20 pixels or more per frame over the course of several frames. If one of the guns had a constant extra latency of one frame, in that theoretical case it would always drag 20+ pixels behind the other gun, and that would be clearly visible in the video.
Averaging wouldn't hide that, because during the motion the lag would propagate to every single position.
When every single position sample has the same offset, averaging them will NOT remove that offset; on the contrary, it would actually show it more clearly and reliably, without the jitter.
Again, very basic math. I can't believe I even have to explain that. Your point is totally irrelevant at best, very insulting at worst.
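To put numbers on that point, here is a toy sketch (illustrative values only, not measurements from either gun): a target moving 20 px per frame, one trace with roughly ±2 px of jitter, one trace with a constant one-frame lag. Averaging over a few frames shrinks the jitter error toward zero, but the lag error stays pinned near the full 20 px offset.

--- Code: ---
// Toy model: a target moving 20 px per frame. One trace has +/-2 px of jitter,
// the other has a constant one-frame (20 px) lag. A short moving average
// shrinks the jitter error but leaves the lag error at the full offset.
#include <cstdio>
#include <cstdlib>

int main() {
    const int FRAMES = 12, SPEED = 20, WINDOW = 4;
    int truth[FRAMES], jittery[FRAMES], lagged[FRAMES];
    std::srand(1);
    for (int f = 0; f < FRAMES; ++f) {
        truth[f]   = f * SPEED;                           // ideal position
        jittery[f] = truth[f] + (std::rand() % 5 - 2);    // +/-2 px noise
        lagged[f]  = truth[f > 0 ? f - 1 : 0];            // one frame behind
    }
    for (int f = WINDOW - 1; f < FRAMES; ++f) {
        double avgT = 0, avgJ = 0, avgL = 0;
        for (int k = 0; k < WINDOW; ++k) {
            avgT += truth[f - k];
            avgJ += jittery[f - k];
            avgL += lagged[f - k];
        }
        std::printf("frame %2d  jitter error %+6.2f px  lag error %+6.2f px\n",
                    f, (avgJ - avgT) / WINDOW, (avgL - avgT) / WINDOW);
    }
    // Typical output: the jitter error hovers around 0, while the lag error
    // sits at about -20 px once the window no longer includes frame 0.
    return 0;
}
--- End code ---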


--- Quote from: RandyT on December 31, 2022, 06:45:02 pm ---Showing that a controller is as fast as the host system and software will ever allow is mind-numbingly simple, provided you have a few cycles to spare.  All it takes is adding a couple of lines of code to your firmware to toggle a spare pin, at the precise location where changes in your position calculations are "set in stone" and will be transferred, and then measuring the frequency of the signal on that pin with an oscilloscope.  So long as you aren't working from buffered data, it will paint a very clear picture as to timeliness of the updates.  If you are sending accurate updates 60 times a second and the host/software isn't able to act on those updates in near real-time, there's nothing you will ever be able to do about it.  In other words, even if it's not as timely as the original hardware and methodology, it's not on you, and the increase in accuracy will probably make up for it "in spades".

--- End quote ---
Once again you have no clue how things work, or what I wanted to show.
Your suggestion is a very basic and widely used method for measuring digital state changes, like button presses, but it won't work here because it completely ignores the latency of the sensor itself. So even if you measure the speed at which the sensor or the gun sends data (which I already did: ~4ms for the sensor itself, ~5ms for the sensor plus the rest, over several thousand samples) against the speed at which the host hardware handles it, it won't give you squat for the total latency.
Your methodology here is completely useless in this context, and doesn't show anything valuable that I don't already know.
Even if I measured the latency of the sensor itself, it wouldn't give my end users much valuable information, as it wouldn't be a real-world case.

What I am showing here is how the gun can match the native Guncon on the MiSTer FPGA PS1 core, nothing more, nothing less.
It's not supposed to be a very technical video, just something showing MiSTer users what they can expect from it in real-world use, which is what it is all about here.
When I said a few messages ago that the GUN4IR does as well as CRT guns at their own game, you said you'd be curious to see a video that shows it.
Which is what I clearly did. Whether you accept it as proof or not is totally on you. I know my methodology and its results are sound, especially after all the research and consulting I did, no matter what you think you might know better.
Some of us just don't make claims without thoroughly studying and testing them first ;)
And I accept criticism, but only when it actually makes sense in context.

But you clearly won't change your mind anyway, so let's agree to disagree and leave it at that; I've wasted way too much time on this already.
RandyT:
If you want to boil it down to what you stated, what you showed was that it keeps up in the averaged calibration screen, not in actual use.  I don't think that's a better way to think about it.  And the math you are using includes a lot of assumptions, like the error being a constant.  "Jitter" is, by its nature, specifically not that, but rather a small error in timing. Timing errors can manifest themselves as both over-runs and under-runs.  The composite cable of GCs provides a tighter timing reference than earlier lightguns, but it's not digitally perfect, due to the analog parts and signals, and is still subject to these timing errors. This would be much more apparent on a screen which is not averaging/oversampling the cursor position.  But I get that there aren't really any options in available software to view it in that form.

But I will admit that I do not know what MiSTer does with an original GC.  It could be very different from how it interacts with an original console.

As for the method I stated not being valuable, it was very specific that it included accurate data.  I.e. the latest data available from the camera module, so it's factored in.  I don't believe that there is currently a better sensor or method (reasonably) available for this type of controller, so what the camera module is doing is somewhat moot, so long as it is faster than 60 fps (which it easily is and then some) with enough margin left for processing overhead to meet the 60fps of the original control.  That total timeframe is exactly what would be measured at the point in which the data is to be transferred to the host.  Or do you believe otherwise?  If so, I'm interested to know why, because that's where the "rubber meets the road" and anything beyond that is not in your control.
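For a rough sense of that margin, here is a back-of-the-envelope check using the ~4ms / ~5ms figures quoted earlier in this thread (treated as reported values, not something measured independently):

--- Code: ---
// Timing budget for one 60 Hz frame versus the sensing chain latency figures
// quoted in this thread (~4 ms camera/sensor alone, ~5 ms sensor + processing).
#include <cstdio>

int main() {
    const double frame_ms  = 1000.0 / 60.0;  // one 60 Hz frame: ~16.67 ms
    const double sensor_ms = 4.0;            // reported camera/sensor latency
    const double total_ms  = 5.0;            // reported sensor + processing latency
    std::printf("frame period: %.2f ms\n", frame_ms);
    std::printf("headroom after sensing: %.2f ms, after processing: %.2f ms\n",
                frame_ms - sensor_ms, frame_ms - total_ms);
    // ~11-12 ms of slack per frame, so the sensing/processing chain alone
    // should not push the report past the 60 fps cadence of the original gun.
    return 0;
}
--- End code ---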
JayBee:

--- Quote from: RandyT on January 01, 2023, 04:12:21 am ---If you want to boil it down to what you stated, what you showed was that it keeps up in the averaged calibration screen, not in actual use.  I don't think that's a better way to think about it.  And the math you are using includes a lot of assumptions, like the error being a constant.  "Jitter" is, by its nature, specifically not that, but rather a small error in timing. Timing errors can manifest themselves as both over-runs and under-runs.  The composite cable of GCs provides a tighter timing reference than earlier lightguns, but it's not digitally perfect, due to the analog parts and signals, and is still subject to these timing errors. This would be much more apparent on a screen which is not averaging/oversampling the cursor position.  But I get that there aren't really any options in available software to view it in that form.

As for the method I stated not being valuable, it was very specific that it included accurate data.  I.e. the latest data available from the camera module, so it's factored in.  I don't believe that there is currently a better sensor or method (reasonably) available for this type of controller, so what the camera module is doing is somewhat moot, so long as it is faster than 60 fps (which it easily is and then some) with enough margin left for processing overhead to meet the 60fps of the original control.  That total timeframe is exactly what would be measured at the point in which the data is to be transferred to the host.  Or do you believe otherwise?  If so, I'm interested to know why, because that's where the "rubber meets the road" and anything beyond that is not in your control.

--- End quote ---
I've taken all parameters into consideration, and none of what you are saying here has any relevance or significance for the case at hand.
You are trying to find issues where there are none, just to desperately make your point.
The analog sync and the CRT gun sensor are precise down to a few microseconds, and are very consistent; otherwise it wouldn't even work the way it does. The main cause of the jitter we get with these CRT guns isn't even analog/digital imprecision. And again, for fast motion over the course of several frames, none of the inaccuracies or errors is significant enough to actually cause issues or prevent a good analysis of the data. If that were the case, it would be very visible even within the limits of the hardware.

And the rest of your message again completely misses the point; the argument you are trying to make is totally empty in relation to the video and what it shows.

But I'm done wasting my time arguing here; at this point you have totally lost yourself in convoluted explanations with no purpose or end.

Fine by me, but please stop wasting both our time and polluting this thread, and go do something else, something more useful, like spending time with your friends or family for the New Year celebrations.
RandyT:
Of course it has to be in the microseconds.  The raster is moving extremely fast and if the timing wasn't reasonably precise, the cursor would bounce around so much you probably wouldn't see it. But that doesn't mean it is without error, as it would be if a similar system were possible in the purely digital realm (which it is obviously not.)

But no matter, you are correct that neither of us is likely to see eye-to-eye on this.  It doesn't truly matter anyway, as the current best approach to this problem on modern displays is the one you, and a couple of others, have taken, and no other approaches compare.  At least in my opinion.  None are going to be a perfect analog to the original technology, nor do they need to be, so long as they work well.  Yours obviously must, to have so many happy users.

Hope you have a nice New Year's celebration. :)