I'm not sure I understand this right: is it possible to set the digital-only DVI as primary? If not, would this mean I'd have to disconnect the other analog/digital DVI port to use my PC at normal 1080p resolutions on the digital-only port?
Both of those ports are very likely DVI-I, regardless of the color, because they have the cross pattern. I've seen digital-only DVI-D ports with the cross too, which is misleading (e.g. on the Asus HD 6450), but they're not common.
From Windows, you can set any output to be primary, secondary, whatever. No limitations. The problem is not that.
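If you want to see that from Windows itself, here's a minimal sketch using the Win32 EnumDisplayDevices call (just an illustration, nothing to do with the driver's internals): it lists the outputs Windows knows about and flags the one currently set as primary, which shows that "primary" is only a desktop setting that can be moved to any active output.

```c
/* Sketch: list the video outputs Windows exposes and mark the primary one.
 * Build with e.g.: cl list_outputs.c user32.lib
 */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DISPLAY_DEVICE dd;
    DWORD i;

    for (i = 0; ; i++) {
        ZeroMemory(&dd, sizeof(dd));
        dd.cb = sizeof(dd);
        if (!EnumDisplayDevices(NULL, i, &dd, 0))
            break;                          /* no more outputs */

        printf("%-14s %-30s %s%s\n",
               dd.DeviceName,               /* e.g. \\.\DISPLAY1 */
               dd.DeviceString,             /* adapter name */
               (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
                   ? "[active] " : "[inactive] ",
               (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
                   ? "[primary]" : "");
    }
    return 0;
}
```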
The problem has to do with arcade monitor detection. Because arcade monitors are usually not detected by the usual detection protocols, a video card will blank the output where an arcade monitor is attached. However, when a video card detects NO monitors at all, it will leave one of the outputs unblanked and send video through it. This is the "vga-enabled" output, and I believe it's a legacy behaviour. We exploit this behaviour to overcome monitor detection issues. We often say "find your card's primary output" when we should be saying "vga-enabled" output: any output can be "primary", which is a software term, while only one of them is the "vga-enabled" or default output, which is a hardware concept.
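For the detection side, a similar sketch (again just an assumed illustration, using the same Win32 call at the monitor level) asks Windows which outputs actually have a monitor associated with them; an arcade monitor that fails detection will typically show up with no monitor entry at all on its output, which is the case where the card decides to blank it.

```c
/* Sketch: for each output, list the monitors Windows associates with it.
 * Build with e.g.: cl list_monitors.c user32.lib
 */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DISPLAY_DEVICE adapter, monitor;
    DWORD i, j;

    for (i = 0; ; i++) {
        ZeroMemory(&adapter, sizeof(adapter));
        adapter.cb = sizeof(adapter);
        if (!EnumDisplayDevices(NULL, i, &adapter, 0))
            break;

        printf("%s (%s):\n", adapter.DeviceName, adapter.DeviceString);

        /* second-level enumeration: monitors hanging off this output */
        for (j = 0; ; j++) {
            ZeroMemory(&monitor, sizeof(monitor));
            monitor.cb = sizeof(monitor);
            if (!EnumDisplayDevices(adapter.DeviceName, j, &monitor, 0))
                break;
            printf("    monitor: %s\n", monitor.DeviceString);
        }
        if (j == 0)
            printf("    no monitor detected on this output\n");
    }
    return 0;
}
```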
No. Let's say you boot your computer with a PC monitor attached to one DVI-I and an arcade monitor plugged into the "vga-enabled" DVI-I. Because the PC monitor is detected, if the arcade monitor is not, the card will turn off the arcade monitor's output even though it's the vga-enabled one: it already has a working monitor, so it won't feel the need to fall back to the vga-enabled output. On the other hand, if the arcade monitor is detected, even if you have to force detection by using resistors, the card won't blank its output on boot, and will keep that configuration with or without the PC monitor attached.
EDIT: By vga-enabled I don't mean an analog-capable output. You can have several analog-capable outputs, either VGA or DVI-I, but only one of them is the default, vga-enabled output.