I'd like to post an update on this, since I've been testing it thoroughly for the last two weeks. I've tried both Windows 10 and Linux on it, and here are some notes on what I've found, for anyone interested.
This is my build (end of June 2019):

ASRock DeskMini A300            124,00
AMD Ryzen 5 2400G               126,21
HyperX Impact 8 GB x 2          123,46
HP EX920 256 GB NVMe M.2 SSD     60,94
WD Scorpio Blue 750 GB SATA       0,00 (recycled from a laptop)
------------------------------------------------
Total                           434,61
- First, I'd like to point out that these APUs are different from anything I've tested so far, in a lot of ways. The way AMD commercially labels these products as "Vega" is misleading: while we can use mostly the same techniques on discrete video cards from the HD 5xxx family up to the Vega 56, these RX Vega 11 APUs are certainly different, and some critical stuff simply won't work. AMD refers to this hardware as "Raven", and it seems to have its own specific code paths inside the drivers.
- The most shocking thing about this hardware is that it DOES NOT SUPPORT INTERLACE. Not on Windows, not on Linux. On Windows, interlaced modes are rejected by the driver. If the driver is patched to force interlaced modes to be accepted, the result is 30 Hz progressive: the vertical retrace for the odd field is missing, so the two fields get scanned back to back as a single full frame instead of two 60 Hz fields. On Linux, the driver doesn't reject the mode, but instead some blurry, downscaled 240p mode is activated, with halved width.
- Working with a desktop at 240p is really inconvenient and practically unusable; even for me, that's too much. I've found 768x288@50 a bit more usable (a mode of the shape sketched below), though still not enough in many situations. The conclusion is that you can no longer do any reasonable setup job at a resolution like this: you need to resort to a PC monitor or a remote desktop for anything other than using a frontend to launch games. This applies to both Windows and Linux.
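For reference, as an X modeline that mode would look something like this; the numbers are a sketch in the usual 15 kHz PAL range, not something to copy blindly (generate your own with your usual tools):

    Modeline "768x288_50" 14.87 768 792 864 952 288 290 293 312 -hsync -vsync

That's roughly 15.6 kHz horizontal and 50 Hz vertical: a full progressive frame with 288 visible lines, which is why it's somewhat more readable than 240p while still being a native CRT mode.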
- When booting into DOS, video support (BIOS video modes) is different from anything I've seen; it's as if it were being emulated. I've tried several DOS programs that switch video modes (both regular BIOS modes and VESA modes), and none of them work. The screen stays stuck at 48 kHz, and mode 3 is reported as the active one. It doesn't matter whether I use VGA or HDMI. Activating a VESA mode results in garbage on the screen. I've dumped the VBIOS, patched it with ATOM-15, and forced it active in RAM (not flashing!), but it doesn't produce the expected 15 kHz output: the right half of the screen shows garbage and the output frequency stays unmodified. No interlace support here either.
- On Linux, the first thing you need to know is that these APUs are NOT supported by the old "radeon" driver, but by the new "amdgpu" one. Once you have the correct driver, you need to know that the VGA output is actually DP-3 from the kernel's point of view. So in the kernel boot line you have to add "video=DP-3:640x240ec", and in the kernel patches for custom resolutions you need to add 640x240, since this hardware does not support interlace. In X, DP-3 maps to "DisplayPort-2", etc. I've worked through all this with the help of Ves, and aside from interlace simply not existing, everything else works once you figure this out (see the summary below).
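To make the connector naming concrete, here's a minimal sketch of how it fits together, assuming amdgpu plus a 15 kHz-patched kernel; the video= parameter is the one from above, while the xrandr modeline numbers are just illustrative:

    # Kernel boot line: the VGA output is DP-3 as far as the kernel is concerned
    video=DP-3:640x240ec

    # In X the very same connector shows up as DisplayPort-2, so a mode added
    # by hand would look like this (~15.7 kHz / 60 Hz, progressive):
    xrandr --newmode "640x240_60" 13.10 640 664 728 832 240 243 246 262 -hsync -vsync
    xrandr --addmode DisplayPort-2 "640x240_60"
    xrandr --output DisplayPort-2 --mode "640x240_60"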
- On Windows 10 (1903, Adrenalin 18.5.1), the biggest disappointment is that EDID emulation is not supported. This breaks the whole configuration recipe we've been using since the days of the HD 5000 series. EDID emulation fixed two issues at once: it helped with display detection, and it filtered out undesired modes. As a note, EDID emulation has never been supported for DP connectors anyway, but I'd hoped it would work on the HDMI output (it doesn't). Fortunately, although it doesn't fix detection, ToastyX's CRU works just as well for EDID emulation from the Windows OS point of view. So now it's a matter of using this tool to create a custom EDID with 15 kHz modes, which (attention) must be progressive, otherwise they'll be ignored; an example follows below. VMMaker works as usual with either normal or super resolutions, although anything interlaced will be rejected.
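As an illustration, the kind of detailed resolution I mean would carry timings like these (my own example numbers, the same 640x240p mode as the Linux sketch above; CRU just needs the equivalent fields filled in):

    640x240p @ 60.1 Hz, pixel clock 13.10 MHz
    Horizontal: 640 active, 24 front porch, 64 sync, 104 back porch -> 15.75 kHz
    Vertical:   240 active,  3 front porch,  3 sync,  16 back porch

Note again: progressive only; the same mode defined as interlaced gets ignored.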
- With regard to the VGA output on the DeskMini A300 box, I have to say I was really surprised to find it perfectly usable, even in the BIOS setup (with a 31 kHz monitor in that case, obviously). Since it's a DisplayPort on the inside, the motherboard must carry some chip for digital-to-analog conversion, and it does a really good job. It was a nice surprise to realize that there's no low-dotclock limitation on this output. That's not the case for the HDMI output, which is unusable for 15 kHz precisely because of this, at least with the current patches for this driver (maybe some extra limits need to be removed). But in my experience, even with the limits removed, HDMI has always had leftover limitations that virtually forced you to use super resolutions. Well, that's not the case with this "VGA" output, and that's good.
- The "dark" side of this chip is that it seems to cut the video signal as soon as it doesn't feel a load in the RGB lines. For those of you who've been around some time you know this means detection problems. The specific case where this is an issue is with arcade monitors connected through a J-PAC. Since the J-PAC doesn't load the lines with 75 Ohm, you'll get a black picture whatever you do. It doesn't help to force an output enabled from Linux boot line: the OS will enable the output but video won't pass through the DAC chip. It won't work to use EDID emulation either: Windows will enable the desktop and everything but video will get stuck in the chip. The only thing that works is to fake the 75 Ohm load with three resistors between the R-G-B lines and their respective grounds. You can do it in the actual VGA cable. I tried something more fancy by modding a
Soft-15kHz EDID dongle I had around. The downside of this trick is that it dramatically cuts off the picture's brightness. Since this is mainly required with a J-PAC and this already has a video amplifier, I believe I'll manage to compensate the brightness up using the potentiometers in the monitor's chassis. But it'd be nice to find a way to fake this load without affecting picture quality. It must be noted that I've only seen this detection issue with my cabinet where the J-PAC is, but it doesn't happen with my BVM, which just appears detected as a "Generic non-Plug-n-Play" monitor.
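For those who want to try the resistor trick, this is the idea, assuming the standard DE-15 VGA pinout (pins 1/2/3 = R/G/B, pins 6/7/8 = their returns); 75 Ohm is the value the chip expects to sense:

    pin 1 (R) ---[ 75 Ohm ]--- pin 6 (R GND)
    pin 2 (G) ---[ 75 Ohm ]--- pin 7 (G GND)
    pin 3 (B) ---[ 75 Ohm ]--- pin 8 (B GND)

Each resistor goes in parallel with its line, inside the cable hood or a dongle, so the DAC sees a terminated load even though the J-PAC itself doesn't terminate anything.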
- Before you ask: this DAC does NOT add any latency to the VGA output. I've measured it: next-frame response, consistently, on Windows 10 with frame delay. Unfortunately (again), Linux lags one extra frame with the same frame delay configuration, using -video opengl.
- Regarding performance, this is a 4-core CPU and its single-thread performance isn't great (1944 in PassMark), but I've found it runs everything I've tried just fine, usually with frame delay 7. Obviously, the more demanding drivers require lowering the frame delay.
- The lack of interlaced modes is probably the worst issue, and quite probably an unsolvable one, when it comes to considering this a viable emulation box. I must say I was so used to having interlaced modes available that not having them, for vertical games for instance, feels like something is missing. However, once you try those games in a progressive, scaled-down mode, it doesn't look so terrible. Please don't get me wrong: playing vertical games that way sucks deeply, but so does using interlaced modes; the extra sharpness you get comes at the price of flicker, so it's hard to say which way sucks more. And people have always complained about flicker anyway. A different story is systems that natively worked in interlaced modes, or that could switch between progressive and interlaced. For those, at least a proper deinterlacing method could be implemented. What I've found is that GroovyMAME is not properly suited for this case, since I've always been conceptually against downscaling. But I may have been wrong. I'd really like to hear Recap's comments on this.