Well, I did a load more testing on this and unfortunately have decided to give up on GroovyArcade/Linux for now.
Even with the fixes to xorg.conf described above, I am still seeing artifacts in super resolutions. The artifact is a triangular area in the upper right of the screen that lags one frame behind; it looks like an incomplete blit versus page flip (or something similar) rather than a signal-related problem. Uncommenting the "ShadowPrimary" option in xorg.conf seems to fix it, but halves the framerate.
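For reference, this is roughly the xorg.conf change I mean. This is a sketch, not my exact file: "ShadowPrimary" is a real option of the radeon/amdgpu DDX drivers, but the Identifier and Driver values here are placeholders — adjust them to match your own card and setup.

```
# Hypothetical xorg.conf excerpt -- Identifier/Driver are placeholders.
Section "Device"
    Identifier "Card0"
    Driver     "radeon"
    # Enabling ShadowPrimary masks the one-frame-late triangular
    # artifact in super resolutions, but halves the framerate.
    Option     "ShadowPrimary" "true"
EndSection
```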
There is also an issue that causes periodic stalls of 3-4 seconds when super resolutions are enabled for vertical modes on a horizontal monitor.
We are already aware of the unresolved video playback problems with kernels >= 4.14, and of the interlaced vblank problem, which requires a kernel patch.
Getting the latest version of the X server requires a full system update, as Arch doesn't support partial updates. Unfortunately, I have found that when running an older (4.10) kernel on the latest Arch, the system is unable to load libQt5. The older kernels also don't compile under the latest Arch. Meanwhile, the latest MAME binaries require the latest glibc and libQt5 and won't run on an old Arch system, and the latest MAME source doesn't compile cleanly there either.
So right now the choice seems to be: run an older Arch system and kernel, with working video playback but limited to an older X server and MAME version; or run a current Arch system with the latest MAME but no video playback. That's on top of the unresolved issues with super resolutions.
I also tested a Win7/CRT Emudriver/GroovyMame setup and did not observe any issues with super resolution artifacts, super resolutions in vertical modes, interlaced vblank or video playback. Win7 also doesn't suffer from the library/kernel/compiler versioning nightmare of Arch.