There are two things to consider here: horizontal active video time and the number of pixels crammed into that time. The two are completely independent, but are limited by the analog bandwidth of the monitor.
The width of the picture is determined by the active line time. On each line, the horizontal sync pulse is surrounded on both sides by blanking intervals, sometimes known as the front and back porches. During these porches, as well as during horizontal sync, black is sent to the monitor. Between the blanking intervals (but not during vertical blanking or sync), active video is sent to the monitor. A game with short porches will be "wider" than a game with long porches, since its active line time is longer (total line time being the same, which is the case if both games run the same vertical resolution at the same vertical refresh rate).
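To make the arithmetic concrete, here's a minimal sketch (the porch and sync widths are made-up example values, not taken from any particular game) showing how shorter porches leave more of the fixed line period for active video:

# Assumed standard-res line rate; porch/sync widths below are hypothetical.
HSYNC_RATE_HZ = 15_720
total_line_us = 1e6 / HSYNC_RATE_HZ   # ~63.6 us per line

def active_time_us(front_porch_us, sync_us, back_porch_us):
    # Active (visible) portion = total line time minus sync and both porches.
    return total_line_us - (front_porch_us + sync_us + back_porch_us)

wide_game   = active_time_us(front_porch_us=1.5, sync_us=4.7, back_porch_us=4.7)
narrow_game = active_time_us(front_porch_us=3.0, sync_us=4.7, back_porch_us=8.0)

print(f"total line: {total_line_us:.1f} us")
print(f"short porches -> {wide_game:.1f} us active (wider picture)")
print(f"long porches  -> {narrow_game:.1f} us active (narrower picture)")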
The number of pixels per line is determined by how fast the video output changes state. The hardware limitations on the game board end are memory (you have to have enough memory to store all the pixels and scan through them) and the update rate of the video DAC. The limitation on the monitor end is the analog input bandwidth of the monitor. You can calculate the "pixel rate" or "dot clock" as hsync_rate * total line pixels (which includes blanked pixels). In order for the monitor to display the picture without significant blurring, the analog input bandwidth must be at least this rate, and preferably at least 2x this rate. To get a feel for the numbers, a typical dot clock for a standard-res output might be 7.68MHz (512 total pixels at 15kHz). You can usually find the analog input bandwidth specified in the monitor's databook.
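Here's the same calculation as a quick sketch, using the 7.68MHz example from above (the 12MHz monitor bandwidth is just an assumed figure for illustration):

def dot_clock_hz(hsync_rate_hz, total_pixels_per_line):
    # Pixel rate = line rate * total pixels per line (blanked pixels included).
    return hsync_rate_hz * total_pixels_per_line

dclk = dot_clock_hz(hsync_rate_hz=15_000, total_pixels_per_line=512)
print(f"dot clock: {dclk/1e6:.2f} MHz")     # 7.68 MHz, as in the text

monitor_bandwidth_hz = 12e6                 # assumed standard-res monitor spec
print("meets minimum (>= 1x dot clock):", monitor_bandwidth_hz >= dclk)
print("comfortable   (>= 2x dot clock):", monitor_bandwidth_hz >= 2 * dclk)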
Typical analog input bandwidths for standard res monitors might be in the range of 10-15MHz. Multisyncs go significantly higher to support higher resolution modes. IIRC, the KT-2914 (Betson Multisync) is about 55MHz, which is just under 1.5x the dot clock of most SVGA outputs (the highest resolution that monitor supports, and it blurs a little).
The gist of it is that the monitor doesn't care how many horizontal pixels you feed it. I've used a Matrox G100 to output 1440x480 interlaced video before, which is compatible with standard res. The horizontal resolution is way higher than you'd expect, but it brings the dot clock into a range the PLL on the video card can lock onto, and I just stretch everything 2x horizontally, so everything looks the same in the end. The monitor has no clue I'm doing this.
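A sketch of why the trick is invisible to the monitor (the 858/1716-pixel totals and NTSC-ish line rate are assumed standard 480i-style timings, not my exact modeline): doubling the horizontal pixel count doubles the dot clock, but the line rate and active line time stay the same, which is all the monitor ever sees.

HSYNC_RATE_HZ = 15_734                  # assumed NTSC-ish line rate

def mode(active_px, total_px):
    dclk = HSYNC_RATE_HZ * total_px     # dot clock in Hz
    active_us = active_px / dclk * 1e6  # active line time in microseconds
    return dclk, active_us

base_dclk, base_active = mode(active_px=720,  total_px=858)   # 480i-style timing
wide_dclk, wide_active = mode(active_px=1440, total_px=1716)  # 2x horizontal

print(f"base: {base_dclk/1e6:.2f} MHz dot clock, {base_active:.1f} us active")
print(f"2x:   {wide_dclk/1e6:.2f} MHz dot clock, {wide_active:.1f} us active")
# Active line time comes out identical, so the monitor can't tell the modes
# apart; the card just has twice as many (horizontally squeezed) pixels,
# which the 2x stretch undoes.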