I see your wall of text and I raise you with my WALL OF TEXT! RAWR!
Even if you get 100% speed in MAME, if the original game didn't run at 60hz, it will stutter on a 60hz LCD. If V-Sync is on, you must be running at the game's native refresh rate for smooth video. Mortal Kombat runs at ~54.7hz, so "perfection" there is running at 54.7hz. There is nothing magical about 60hz. Most arcade games DON'T run at 60hz. You realize that there are a lot of games that run ABOVE 60hz, right? It's not possible to run these games at the correct speed on a 60hz monitor with V-Sync on. I love how you mention Pac-Man as an example of a game where this doesn't matter, but Pac-Man actually runs at 60.60hz. It's impossible to get it to run at the right speed with V-Sync on a 60hz monitor.
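To put rough numbers on the Mortal Kombat example (back-of-the-envelope only, using the ~54.7 Hz figure above, nothing pulled out of MAME itself):

```python
# Back-of-the-envelope: a ~54.7 Hz game on a 60 Hz V-Sync'd display.
# Either the game is sped up to match the display, or it runs at the correct
# speed and some frames have to be shown twice -- which is the stutter.

game_hz = 54.7       # approximate original refresh of Mortal Kombat
display_hz = 60.0    # fixed LCD refresh with V-Sync on

speedup = (display_hz / game_hz - 1) * 100
print(f"Locked to 60 Hz: game runs about {speedup:.1f}% too fast")

extra_refreshes = display_hz - game_hz          # refreshes with no new game frame
frames_per_hiccup = game_hz / extra_refreshes   # how often a frame gets doubled
print(f"Correct speed: ~{extra_refreshes:.1f} duplicated frames per second, "
      f"about one every {frames_per_hiccup:.0f} game frames")
```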
I don't run the original game and I don't output it with analog. I use an emulator that throttles the game so it can run on a processor way faster than the original, and outputs the results in a way that allows my computer to display them at a frequency and resolution that my monitor is compatible with. I run my MAME at 1920x1080 @ 60 Hz, and it works just fine for me. Very few games stutter and none of them tear (in MAME). Demul 3D games stutter, but that's because the game is running below 100% and the 3D card isn't rendering fast enough to hit 60 fps.
You don't understand how V-Sync and tearing work. Tearing primarily occurs when you're outputting frames faster than your monitor's refresh rate. By the way, V-Sync introduces at least 1 additional frame of lag universally. It's inescapable. G-Sync has less input lag. It's objectively better.
You are confusing throttling with tearing. Tearing is when one frame is not fully drawn before the next monitor cycle because the graphics card has not fully rendered it, and you notice a visible line across your screen where two frames intersect. The frame is drawn by the graphics card, so if the source is slower than the monitor, that means you are UNDER the monitor frequency, not above it.
V-sync syncs your video card output to your monitor refresh. Anything above 60 frames per second is throttled, and while that means a dropped frame here and there, it is already rendering faster than the human eye can see, so you will never notice the dropped frame. And while you could argue that v-sync is slower, the difference is still below the threshold of human vision, so it is irrelevant. However, anything BELOW 60 fps will still potentially tear.
G-sync allows the monitor to change frequency and match the output of the card. There are two reasons this is better: 1) so you don't get tearing below 60 fps, and 2) so you don't get throttling (frame drops) above 60 hz.
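Here's a toy model of the pacing difference I mean -- illustrative numbers only (a hypothetical 50 fps source is assumed), nothing to do with how any real driver works:

```python
# Toy model: when each rendered frame actually gets shown on a fixed 60 Hz
# display with V-Sync versus a variable-refresh (G-Sync-style) display.
# Assumes a source that only manages 50 fps; purely illustrative.
import math

refresh_ms = 1000 / 60   # fixed display: a scanout every ~16.7 ms
frame_ms = 1000 / 50     # renderer finishes a frame every 20 ms

print("frame  ready(ms)  fixed 60 Hz + V-Sync   variable refresh")
prev_fixed = prev_vrr = 0.0
for i in range(1, 7):
    ready = i * frame_ms
    fixed = math.ceil(ready / refresh_ms) * refresh_ms  # wait for the next refresh tick
    vrr = ready                                         # display refreshes on demand
    print(f"{i:5d}  {ready:9.1f}  {fixed:8.1f} (+{fixed - prev_fixed:4.1f})"
          f"   {vrr:8.1f} (+{vrr - prev_vrr:4.1f})")
    prev_fixed, prev_vrr = fixed, vrr
```

The uneven gaps in the fixed-refresh column are the judder you get when the source can't hold 60; the variable-refresh column just follows the source.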
I want to reiterate: If you are rendering above 60 fps, with v-sync the card will simply drop a frame, and no tear will occur. The dropped frame is technically not noticeable to the human eye because technically your brain can't sample more than 30 frames in a second. Some people claim they can see it, but then some people claim they can hear jitter on a laserdisc or dvd. From my experience, 99% of these people don't know jitter from distortion (or frame loss from screen tearing).
Given how close together your eyes are, I'm not even convinced you can see at 60 fps.
Was that supposed to be an insult?
That's some of the most ignorant ---saint's minion-poo--- I've ever read on this forum. Digital rendering? What the ---fudgesicle--- are you talking about? You realize that there are digital CRTs, right? Jesus you're clueless. You realize that 24fps movies have judder/stuttering at 60hz, right? Watch any movie with a slow panning camera. It's not smooth.
Your brain can process information from your eyes at approximately 30 frames per second. Movies were traditionally shot on 35 mm film, which runs at a rate of 24 frames per second. Any "stuttering" you see on a film, whether played on a projector at 24 fps, or upscaled to run at 60 fps on a digital non-interlaced 60 hz display, is the result of a low frame rate source and is going to look "choppy" particularly when the camera is panning. Thing is, even if G-sync could lower your rate to 24 fps, you would still see this effect because you are now viewing something slow enough for your eyes to catch.
But that is aside from what we are even talking about. The point is, if you couldn't upscale a 24 fps source to run at 60 fps and sync it to a 60hz display, then nearly EVERY frame of a bluray would be torn on the screen.
But it isn't, because it upscales the number of frames (just like 120 Hz TVs do with 60 Hz signals). MAME might not output 60 frames per second, but all that means is a dropped frame (or duplicated frame) now and then. I know this because there is no tearing with v-sync on. And if there is a dropped frame in the video causing a stutter, I can't see it, because humans simply can't process the difference between 60 frames per second and 59 frames per second. Perhaps your superhuman eyes can catch those 16 ms frame anomalies, but mine can't. The irony is that traditional arcades are displayed on a 60 Hz interlaced display, which is actually 30 frames per second, so even if 1 frame in 60 is dropped, you are still seeing more information than you did on an analog display.
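If anyone wants to see that 24-into-60 cadence spelled out, here's the arithmetic (a sketch only, not how any particular player is coded):

```python
# How 24 fps fits onto a fixed 60 Hz display: 60/24 = 2.5, so film frames get
# held for alternating 2 and 3 refreshes (the familiar 3:2 pulldown cadence).
# The uneven hold times are the judder you notice in slow pans.

film_fps, display_hz = 24, 60
holds, shown = [], 0
for frame in range(1, film_fps + 1):
    target = frame * display_hz // film_fps   # refreshes that should have elapsed by now
    holds.append(target - shown)
    shown = target
print(holds)        # [2, 3, 2, 3, ...] -> each film frame held 2 or 3 refreshes
print(sum(holds))   # 60 -> exactly one second of display refreshes
```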
"Sure, lightboost and g-sync will each fix some problems, but you also can't run both at the same time"
Actually, GroovyMAME has black frame insertion, which is essentially a software implementation of what lightboost gives you, so you can have both at the same time.
No, it isn't, really. Lightboost strobes the backlight between each frame, and since you have to sync the strobe with the frequency of the monitor, you can't do it with G-sync. Every article about G-sync has specified this, and trust me, if it was as easy as just inserting a black frame between frames, NVidia would have done it already. PLUS, it is widely known that lightboost is completely ineffective below 120 Hz (60 fps with strobes in between each frame).
What GroovyMame is probably doing is running mame at 30 fps and inserting a black screen after each frame to get 60 fps, hence eliminating frame dropping if the emulator can't kick out a true 60 fps and giving you a more accurate "arcade" experience. This is basically another method of v-sync (throttling to match the refresh rate of the monitor).
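For what it's worth, here's the general shape of the scheme I'm describing -- a rough sketch built on my 30-in, 60-out guess above, not GroovyMAME's actual code:

```python
# Sketch of software black frame insertion as described above: show a real
# frame on one refresh and a solid black frame on the next, so each image is
# flashed briefly instead of held. NOT GroovyMAME's actual implementation.

def present_with_bfi(game_frames):
    """Yield what the display scans out each refresh: frame, black, frame, black, ..."""
    for frame in game_frames:
        yield frame      # refresh N: the real game frame
        yield "BLACK"    # refresh N+1: inserted black frame

# Example: 30 game frames per second become 60 display refreshes per second.
for shown in present_with_bfi(["frame 1", "frame 2", "frame 3"]):
    print(shown)
```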
And both technologies only work on TN screens; they can't achieve the results on any IPS screen, so you have crappy colors and bad off-axis viewing.
Actually, both G-Sync and lightboost work on IPS panels. There is nothing about either technology that's specific to a certain type of LCD panel. Overlord Computer is working on an IPS G-Sync monitor right now. I'm curious, can you type ONE sentence that doesn't have multiple glaring loads of ---saint's minion-poo--- in it?
WRONG again, completely and utterly. Read ANY article about G-sync. Both lightboost and G-sync are not available in any kind of panel other than TN. This is the biggest argument against either technology.
First, IPS cannot even come close to the low latencies that TN panels can achieve. Second, even the IPS panels that supposedly are overclocked to 120 hz have such high latency that it negates any benefit you get from it. You can't reliably or effectively overclock IPS panels to give any kind of real benefit to either screen tearing issues or ghosting issues. This is why the only overclocked IPS panels come from Korea.
"And resolution has nothing to do with it."
I never said that resolution had anything to do with motion smoothness. I said that <= 1080p LCDs suck ass for arcade emulation. Nice strawman argument. Games in MAME look like ---steaming pile of meadow muffin--- without HLSL. HLSL requires 3840x2160 to be able to output an image with even scanlines and a decent shadowmask. Even 1440p is insufficient. Resolution makes a HUGE difference for quality arcade emulation.
WTF are you talking about? I run HLSL on 1080 just fine. Granted, I am not using a magnifying glass and looking at the screen from 2 inches away and shouting "AHA, I see an anomaly, this screen sucks!" but I guess not everyone is as much of a perfectionist as you are.
BTW, what you said was "You can do things with a high resolution LCD that you can't do with an arcade monitor. Playing Ultra Street Fighter IV at 1440p is one of them." Now you are saying even that is worthless because it isn't UHD? My 1200-line monitor works just fine when emulating a 240-line game, thank you very much, and it sure as hell doesn't require me to run over 2000 lines of resolution to accurately emulate 240 lines. Again, WTF are you even talking about?
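Since we're throwing resolutions around, the scale factors are easy enough to check (plain arithmetic; whether a given factor is "good enough" for HLSL's scanlines and shadowmask is the part we apparently disagree on):

```python
# Scale factor from a 240-line arcade game to the display heights being argued
# about. An integer factor lets every source line map to the same number of
# display lines; a fractional one can't.

source_lines = 240
for display_lines in (1080, 1200, 1440, 2160):
    factor = display_lines / source_lines
    kind = "integer" if factor.is_integer() else "non-integer"
    print(f"{display_lines} / {source_lines} = {factor:g}x  ({kind} scale)")
```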
Your whole post is a ton of misinformation, confusion about how technologies actually work, and the idea that it's foolish to spend a lot of money on a quality setup. Quality costs money, the lineage of arcades is about using the highest quality components available to create amazing game experiences, and you're completely full of ---steaming pile of meadow muffin---.
I think it is you who is confused. And as far as quality goes, I spent years selling and installing the best equipment money can buy, and as a result I have an appreciation for quality, but I also understand that sometimes what people use is perfectly acceptable to them, and that there is nothing wrong with it. Like I said, if you need to spend $1000+ on a display to play a 40-year-old arcade game to your liking, then more power to you. But it doesn't change the fact that what I and pretty much every person on this forum use works perfectly, FOR US.