FWIW, and maybe the only part of this post to be somewhat on-topic, a couple of years back my setup started hiccuping as described. It was driving me nuts. It turned out that a piece of malware had somehow found its way onto my cabinet (from web surfing, most likely). Whatever code it was attempting to execute was hogging processor cycles, and doing it at pretty regular intervals. Once I found it and shut it down, everything was smooth again. Probably not related to your problem, but it might be worth a look.
To continue on from what Howard has been saying... I'm not sure exactly how MAME's video system works, but there are ways to get a very close texel-to-pixel mapping, so it's almost exact (perhaps even per-pixel precision). It has something to do with rounding, or offsetting by 0.5f, because Direct3D works with floating-point numbers, surfaces, polys, and textures. It is, after all, a 3D API.
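For what it's worth, here's a rough sketch of the 0.5f business as I understand it. This is my assumption of how a Direct3D 9-style renderer would handle it, not MAME's actual code: pixel centers land on integer coordinates but texel centers sit at (0.5, 0.5), so a screen-aligned quad has to be nudged by -0.5 in x and y to sample each texel exactly once.

```cpp
#include <cassert>

// One vertex of a screen-aligned quad: position plus texture coordinate.
struct Vertex { float x, y, u, v; };

// Hypothetical helper: build a quad corner with the D3D9-style
// half-pixel offset applied so texels map 1:1 onto pixels.
Vertex make_corner(float px, float py, float u, float v)
{
    return Vertex{ px - 0.5f, py - 0.5f, u, v };
}
```

Without that offset, every texel straddles four pixel centers and the filter smears the image even at a 1:1 size.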
I wonder about this. Just because MAME may be rendering with sub-pixel precision, it doesn't mean that it can be displayed that way. Moving images like scrolls will be smoother with SPP, but static images will still have some manner of artifacting. As computing power and display resolution continue to advance, it's likely that MAME will be able to evolve into something very interesting down the road. I predict that we will eventually see virtual dot-triads, "monitor emulation" if you will, that will allow 100% original display accuracy on ultra-modern hardware. That will take some horsepower and serious developer dedication to pull off, but should eventually be possible.
At the moment, however, there is a poor fit between the documentation aspect of MAME (perfect 1:1 displays) and the current crop of display technologies, namely digital. Even on conventional high-res computer monitors, some artifacting will probably be present with D3D, because there doesn't currently seem to be a way to stop it from scaling by a non-integer factor of the original resolution. The recent changes allowing pre-scaling have cut down on the excessive anti-aliasing, which is definitely a step in the right direction. But unless some method of preventing artifacting is introduced, such as an option to limit the screen "object" to the largest exact multiple of the original resolution that fits the user's display, it's difficult to fathom how DirectDraw can be abandoned. It's hard to stay true to the "documentation" aspects without being able to show a full-screen, full-motion display that doesn't exhibit artifacting.
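The "largest exact multiple" idea is simple enough to sketch. This is a hypothetical helper, not an actual MAME option:

```cpp
#include <algorithm>
#include <cassert>

// Return the biggest integer scale factor at which a game_w x game_h
// image still fits inside a disp_w x disp_h display. A result of 0
// means the display is smaller than the game's native resolution.
int max_integer_scale(int game_w, int game_h, int disp_w, int disp_h)
{
    return std::min(disp_w / game_w, disp_h / game_h);
}
```

So a 384x224 game on a 1280x1024 display would render at 3x (1152x672) with a black border, and every source pixel stays a uniform 3x3 block on screen instead of alternating sizes.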
Understand that I am not a D3D programmer, but I am assuming, based on my experience with 3D rendering programs, that the game "screen" is a polygon (or two) which is mapped in real-time with the images generated by the graphics engine within MAME. If this is the case, why not allow the user to define the extents of the screen object, letting one shrink or expand the game screen beyond the edges of the display in order to maintain a non-artifacting image?
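Here's my reading of that suggestion as a sketch (the names are hypothetical): instead of rounding the scale factor down, round it to the nearest integer and center the quad, so any overflow is cropped equally on both sides while the image stays an exact multiple of the original.

```cpp
#include <cassert>

// Position and size of the screen quad in display pixels.
struct Rect { int x, y, w, h; };

// Scale to the nearest integer multiple of the game's vertical
// resolution; x or y go negative when the quad overflows the display.
Rect fit_nearest_integer_scale(int game_w, int game_h, int disp_w, int disp_h)
{
    int scale = (disp_h + game_h / 2) / game_h;   // round to nearest
    if (scale < 1) scale = 1;
    int w = game_w * scale, h = game_h * scale;
    return Rect{ (disp_w - w) / 2, (disp_h - h) / 2, w, h };
}
```

For example, a 256x224 game on a 1280x1024 display would get a 5x quad (1280x1120), hanging 48 lines past the top and bottom edges but with zero artifacting in what remains visible.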
If I'm all wet on the concept, I'm curious to know why. Otherwise, it seems like a simple way to get rid of DirectDraw once and for all. And as a side note, Aaron seems to imply that DirectDraw is faster than D3D, particularly on less capable video cards. I think the reason it's still around has as much to do with the minimalist hardware approach to machine building as anything else.
RandyT