Author Topic: Solved: VGA2SCART on Linux/Nvidia to CRT-TV  (Read 6619 times)

Lomaxx

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 40
  • Last login:April 23, 2012, 02:42:54 pm
Solved: VGA2SCART on Linux/Nvidia to CRT-TV
« on: March 13, 2012, 05:14:10 am »
EDIT: Nvidia users should note that the first cable design I used does not work with Nvidia cards. I first experimented with an ATI card and later modified the cable, which then also worked with Nvidia cards. So either keep reading or jump here.

Recently I stumbled across the soldering instructions in this subforum for the VGA-to-SCART cable and wanted to give it a try with an old CRT TV I have standing around. I managed to finish the soldering, and the adapter cable should be correct (I tested the connections more than once), though I didn't implement the power bridge for automatic AV switching of the TV. Now I am trying to get a picture on the TV from my Gentoo PC, but so far without success. Here is my current environment:

- TV: Nokia SP63D1 (unfortunately I can't find any detailed info on it)
- Graphics card: Nvidia GeForce FX 5900XT
- Software: Gentoo Linux with xorg-server 1.10 and lrmc installed

I used lrmc to calculate a modeline and fiddled around with my xorg.conf, but can't get anything close to a picture. A signal does seem to reach the TV: while booting I can see tons of short white lines moving quickly, and when launching X with different settings I manage to get two horizontal white lines or something that might be colour output (hard to tell). Before I go into the details of my xorg settings, a general question:

Should a setup like this be possible at all, or am I trying the impossible? On the one hand I read somewhere in the forum that "all nvidia-cards ... should work fine with 15khz". On the other hand, this page and this one state that Nvidia cards need a special cable. However, I also tried to get this going on a different Linux PC with an ATI Radeon 9200SE (not 100% sure about the "9200SE", but definitely an ATI) and only got the same results.

For either of the two cards: which resolution would I need to configure with lrmc? Is there one single resolution that has to fit the PAL TV, or do I have to calculate a modeline for each game resolution I want to use (for example with GroovyMAME)?
« Last Edit: March 29, 2012, 08:53:13 am by Lomaxx »

Paradroid

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 687
  • Last login:July 12, 2025, 08:11:33 pm
    • SCART Hunter
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #1 on: March 13, 2012, 07:10:42 am »
Let me preface by saying I know nothing about Linux... :)

I haven't found a SCART TV yet that wouldn't sync to a computer using VGA to SCART so you should be able to get this working. A Nokia brand television?! Certainly something I haven't seen before! I thought they only made mobile phones... What country are you in, if you don't mind me asking?

ATI cards are much preferred to Nvidia for this purpose. I'm not even sure if the scheme you linked to will work with Nvidia. I've built several cables using this circuit and they work fine with both ATI and Nvidia. Either way, I believe your ATI 9200SE should be perfect for VGA to SCART purposes. I'd go with that.

Using GroovyMAME you'll have access to over a hundred different video modes. GroovyMAME will match the resolution and refresh rate for each game. Wait 'til you see how smooth the scrolling is! Works great with SCART TVs. You define monitor specs for GroovyMAME (or use the default settings) but you don't have to individually create or tweak each mode you want to use.

Since you're running Linux and there are plenty of Linux geeks running GroovyMAME, I'd jump over to that forum and repost all or part of your message there. Someone with more experience than me will help you out, I'm sure.
My MAME/SCART/CRT blog: SCART Hunter

Lomaxx

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 40
  • Last login:April 23, 2012, 02:42:54 pm
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #2 on: March 13, 2012, 09:05:00 am »
Thanks for your answer, Paradroid.

Regarding your question: I'm from Germany. The Nokia TV is really nothing special; Google has some images of it.

I've been reading more about the VGA-to-SCART adapters and now assume that the one I built does not work with Nvidia cards. I will experiment some more with the ATI-based PC and otherwise head over to the GroovyMAME forum as you proposed.

Maybe someone should edit Level42's initial post to point out that there are various schematics for various cards and link to the replies in the same thread that show the other solutions. For now I will try to find out whether mine works at all and will report back.

MonMotha

  • Trade Count: (+2)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 2378
  • Last login:February 19, 2018, 05:45:54 pm
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #3 on: March 13, 2012, 12:36:16 pm »
In general, the nvidia stuff seems to work just fine under Linux at low resolutions. I've never even needed an EDID dongle like the one sometimes needed on Windows (but then I may have never used one of the affected cards). They'll generally do whatever you tell them to, but don't expect to hit the EXACT pixel clock you ask for.

I've had much better luck with oddball modes on nVidia using the nVidia-provided binary blob driver ("nvidia") than the 2D-only driver that comes with X.org ("nv"), but the new "nouveau" driver may also work.

You may need to force the card to pick the right output.  Not seeing a known monitor (via DDC) on the analog output, it will sometimes switch over to the DVI output.  This can happen even if the BIOS and other startup text was on the analog output (probably because it was on all of them).  There's a driver-specific option to force the issue on this one.  It's buried somewhere in nVidia's documentation.  This is a common problem with JPACs because the JPAC doesn't have the proper 75 ohm load on the video outputs, so the card thinks nothing's hooked up.  Your SCART adapter should at least provide that, but, depending on the circuit, it may not.
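If I remember right, with the binary driver it's the "ConnectedMonitor" option in the Device section - roughly like the sketch below (the identifier is just a placeholder; check the nVidia README for the exact spelling and any extra mode-validation options before relying on it):

Code: [Select]
Section "Device"
    Identifier "nvidia-fx5900"
    Driver     "nvidia"
    # Tell the driver a CRT is attached to the analog (VGA) output,
    # so it doesn't fall back to DVI when no DDC/EDID data is seen.
    Option     "ConnectedMonitor" "CRT"
EndSection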

You may also be experiencing a sync issue.  The nvidia driver should be responsive to all the sync polarity and composite sync options, so you should be able to do whatever you need to.

What modeline are you using?

Lomaxx

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 40
  • Last login:April 23, 2012, 02:42:54 pm
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #4 on: March 13, 2012, 01:04:16 pm »
I tried some more with the ATI card and GroovyArcade. I decided to use GroovyArcade instead of manually configuring my Gentoo system because I am hoping it does more things on its own, which lowers the chance of a misconfiguration. So I installed it to a spare partition and tried various video settings. One of them was the vga-1-15khz-pal setting, which IMHO is the one that should fit.
When I select that mode and attach my Eizo monitor through VGA, it shows that the mode is running at 15.6 kHz / 60 Hz, but I still don't get anything useful when attaching the TV/adapter. However, I DO receive differing, jumping black-and-white patterns, so the output port should be right. It's not that the picture is always completely black.

I also rechecked the soldering once again. It is definitely the scheme from this thread, except that all "returns" are connected to each other and to the chassis; but since that is built into the VGA cable (which I use for the adapter) by default and wasn't done by my soldering, I assume this is correct. Is it?

@MonMotha:
While experimenting with my Gentoo Linux and the Nvidia card I used the following Monitor section in my xorg.conf, together with various nvidia driver and screen settings (TwinView etc.):

Code: [Select]
Section "Monitor"
    Identifier     "VGATOSCART"
    VendorName     "Unknown"
    ModelName      "VGATOSCART"
    HorizSync       15.5 - 16.0
    VertRefresh     45.0 - 65.0

    # 640x288x50.00 @ 15.625kHz
    Modeline "640x288x50.08"  12.250000  640 656 712 784  288 293 296 312  -HSync -VSync
    # 800x288x50.00 @ 15.625kHz
#    Modeline "800x288x50.08"  15.375000  800 824 896 984  288 293 296 312  -HSync -VSync

EndSection


Lomaxx

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 40
  • Last login:April 23, 2012, 02:42:54 pm
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #5 on: March 17, 2012, 03:08:34 pm »
I managed to get a big step further and actually got a working picture, but I still have some questions left. First, a few notes on what I did:

After checking the cable once again, I started to wonder whether the TV I was using might be the cause of the problem. From my PlayStation days I remembered that the other CRT TV I have in the house (Nokia 7163 VT Digivision) has an OSD menu where I could set "RGB" and other options for the two SCART inputs. So I connected the ATI PC to that TV and got a useful picture rather quickly.
So I figured that either the other Nokia TV I used before does not support RGB over SCART, or I hadn't set it up correctly. Unfortunately I didn't know where its manual was lying around, and since the remote control is no longer the original one, I wasn't sure whether I had already found all the options. Luckily, after a long search the manual turned up, so I could check it. At first I thought the TV does not support RGB, because the OSD menu only offers a choice between VIDEO and S-VIDEO, but then I found something about RGB input mentioned in the appendix.
The point is that I had misunderstood the purpose of the power bridge (pins 16 and 18) of the VGA-SCART cable. I thought it was only there to switch the TV over to the SCART input (which I can easily do with the remote), but now I know the TV needs it to determine whether a composite or an RGB signal is being fed in.
So I reworked the cable and now have a working picture with the ATI card and the TV I first used. The Nvidia card really does not seem to work with this sort of cable, which is what most VGA-SCART pages say anyway.

Now I am fiddling around with the modelines. The only working one I could find is:

Code: [Select]
Modeline     "720x576" 15.125 720 778 834 968 576 579 607 625 composite interlace +hsync +vsync

All the other ones I tried result in the picture jumping at various speeds and in various directions, which surprises me a bit, since as far as I understand it GroovyMAME should change resolutions if I use the "switchres" mechanism. (Or is that only for VGA monitors, not SCART TVs?)
Also, the picture is very 'nervous' when displaying my login screen or the desktop, but less nervous when playing a video or running MAME. Still, I wonder whether I should be able to get better results. For example, the shadows of the ships in "Raiden" (my favourite shooter) still flicker quite a bit, something I cannot notice when playing it on my LCD monitor. I suspect the interlaced mode is the reason for this nervous flickering, though I might be wrong. I don't really know a lot about video modes; I just remember from my Amiga days that interlaced modes used to show the same nervous flickering. Because of this I tried to find a mode without interlacing.
Should running a non-interlaced mode be possible in theory with my setup/TV? Should other resolutions work as well, as long as I use lrmc to find the appropriate modelines? So far I haven't succeeded, as already mentioned.

Here are the relevant parts of the xorg.conf I am using so far, though for now I am only looking for general answers (for the Linux-specific part I can head over to the GroovyArcade forum):

Code: [Select]
Section "Monitor"
    Identifier  "VGATOSCART"
    VendorName  "Unknown"
    ModelName   "VGATOSCART"
    HorizSync   14 - 18.0
    VertRefresh 45.0 - 65.0
    Option      "DPMS"
#       Modeline "800x576pali" 15.38  800 823 895 984  576 580 583 625 -hsync -vsync interlace
#       Modeline "800x288pal-half"  15.38  800 823 895 984  288 290 292 313 -hsync -vsync
#       Modeline "768x576pali" 14.76  768 789 858 944  576 580 583 625 -hsync -vsync interlace
#       Modeline "768x288pal-half"  14.76  768 789 858 944  288 290 292 313 -hsync -vsync
#       Modeline "720x576pali" 13.88  720 742 808 888  576 580 583 625 -hsync -vsync interlace
#       Modeline "720x288pal-half"  13.88  720 742 808 888  288 290 292 313 -hsync -vsync
#       Modeline "704x576pali" 13.50  704 722 786 864  576 580 583 625 -hsync -vsync interlace
#       Modeline "704x288pal-half"  13.50  704 722 786 864  288 290 292 313 -hsync -vsync
#       Modeline "650x576pali" 12.50  650 669 728 800  576 580 583 625 -hsync -vsync interlace
#       ModeLine "720x576@25i" 13.5   720 732 795 864  576 581 586 625 interlace -hsync -vsync


    Modeline     "720x576" 15.125 720 778 834 968 576 579 607 625 composite interlace +hsync +vsync



    # 640x288x50.00 @ 15.625kHz
#    Modeline "640x288x50.08"  12.250000  640 656 712 784  288 293 296 312  -HSync -VSync

    # 720x576x50.00 @ 15.625kHz
#    Modeline "720x288x50.08"  13.875000  720 744 808 888  288 293 296 312  -HSync -VSync
EndSection


Section "Device"
    Identifier  "Radeon 9200 SE"
    Driver      "ati"
    Option      "DynamicClocks" "true"
    Option      "ForceMinDotClock" "14MHz"
    Option      "MergedFB" "false"
    Option      "IgnoreEDID" "true"
    Option      "VGAAccess" "false"
#    BusID       "PCI:1:0:0"
EndSection


Section "Screen"
    Identifier  "Screen 1"
    Device      "Radeon 9200 SE"
    Monitor     "VGATOSCART"
EndSection


MonMotha

  • Trade Count: (+2)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 2378
  • Last login:February 19, 2018, 05:45:54 pm
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #6 on: March 17, 2012, 03:19:58 pm »
One thing I immediately notice is that your "working" modeline has positive composite sync while all your others have negative separate sync.  Many of the other modelines will also be immediately rejected by the X server upon startup as they have a dot clock less than the minimum that you have forced (14MHz) with an option.
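For example, lowering that limit (or dropping the option entirely) in your Device section should let the roughly 12-14MHz modelines through validation - just a sketch, adjust to whatever your card actually handles:

Code: [Select]
Section "Device"
    Identifier  "Radeon 9200 SE"
    Driver      "ati"
    # 12MHz is below the dot clocks of most of the commented-out modes;
    # alternatively, remove the option and let the driver use its own minimum.
    Option      "ForceMinDotClock" "12MHz"
EndSection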

Lomaxx

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 40
  • Last login:April 23, 2012, 02:42:54 pm
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #7 on: March 17, 2012, 03:54:05 pm »
To be honest, I took all the driver options from some MythTV site that mentioned them, without really knowing what I need. I will experiment now, but can you give me some recommendations on which options I should leave out or add? I gave lrmc another try a few moments ago, but didn't find a way to get a line that contains "composite".

MonMotha

  • Trade Count: (+2)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 2378
  • Last login:February 19, 2018, 05:45:54 pm
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #8 on: March 17, 2012, 05:05:15 pm »
I just calculate the modelines myself :)

Anyway, you can freely play around with the flags on the end, including sync polarity and "composite".  Those don't affect any of the other numbers in any way.

BTW, the first number in the string of numbers is the dot clock in MHz.  Since many of them are below 14, and you are telling the X server that your video card can't handle dot clocks below 14MHz (Option      "ForceMinDotClock" "14MHz"), those will be immediately disregarded as invalid.  You can read the X server's logs to see what modes it's accepting and rejecting.  The log is in /var/log/Xorg.0.log
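For example, something along these lines will show you which modes survived and why the rest were rejected (the grep pattern is just a starting point):

Code: [Select]
grep -iE "modeline|not using|clock" /var/log/Xorg.0.log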

You can also play around with all this using xrandr, but the interface is a bit wonky (this isn't something people generally do, so you have to use the most flexible, and consequently most complicated, xrandr interface available). There are also some bugs surrounding xrandr and interlaced video modes on ATI hardware that were only resolved fairly recently, so you may still hit them if your distro isn't fully up to date.
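For reference, adding one of your modes on the fly looks roughly like this - the output name (VGA-0 here) varies by driver, so check the output of plain xrandr first, and the interlace/sync flags are only honored if the driver supports them:

Code: [Select]
# define a mode from the modeline numbers (name, clock in MHz, then the h/v timings)
xrandr --newmode "720x576_50i" 15.125 720 778 834 968 576 579 607 625 Interlace +HSync +VSync
# attach it to the TV's output and switch to it
xrandr --addmode VGA-0 "720x576_50i"
xrandr --output VGA-0 --mode "720x576_50i"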

You're also using the old "ati" driver. I'm not even sure that thing is still being developed. The modern driver is "radeon". The commercial driver from ATI (which I don't recommend you use unless you need it - it's a bit unstable) is "fglrx".

Lomaxx

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 40
  • Last login:April 23, 2012, 02:42:54 pm
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #9 on: March 20, 2012, 03:04:33 pm »
The lrmc page offers a rather detailed howto on modeline calculation. Alas, it is all a bit too much information for me. Some things I already knew, but the calculation part is still confusing. Maybe I will try to understand it again later.
So far I have continued to tweak and try various modelines by using lrmc and editing xorg.conf. I switched over to the "radeon" driver and a "ForceMinDotClock" of "12MHz". By also trying out positive syncs ("+HSync" and "+VSync") and adding "composite" at the end of some modelines, I now have two types of modelines that at least display a visible, non-jumping picture:

- Interlaced modes, which I already mentioned and which show the whole desktop, but flicker. For example:

Code: [Select]
   # 720x576x50.00 @ 15.625kHz
    Modeline "720x576x25.00"  13.875000  720 744 808 888  576 586 592 625  +HSync +VSync interlace composite


- Two different modelines for non-interlaced modes, which are slightly stretched vertically when the desktop is shown and which don't display the desktop completely:

Code: [Select]
   # 720x576x60.00 @ 15.625kHz
    Modeline "720x288x50.08"  13.875000  720 744 808 888  288 293 296 312  +HSync +VSync composite

    # 625x576x25.00 @ 15.625kHz
    Modeline "624x288x50.08"  12.000000  624 640 696 768  288 293 296 312  +HSync +VSync composite

The non-interlaced modes provide a really nice, solid picture, except for the slight stretching and the desktop not fitting completely (on all sides; the desktop is a bit too big). However, I managed to tweak the GroovyMAME configuration to display a picture which is probably not pixel-perfect, but full-sized and almost in the original aspect ratio for most games. The catch is that I have to display games in their original rotation (horizontal games horizontal and vertical games vertical), which means I need to rotate the TV itself unless I find a better solution. What also helps is that the TV has two additional zoom modes, which affect the aspect ratio a little as well. Note that I disabled "switchres" and "modeline" usage in GroovyMAME for this, so I am always using the same resolution.

So all together I have made some progress, but still haven't completely reached the goal. My biggest burden is my still limited knowledge of the technical side. Of the many questions I have, the following come to mind now:

- Is it dangerous for the TV to run it turned by 90°? Long ago I read that it might damage the TV, but IMHO that's an urban myth. I don't see a reason why it should damage it.

- Can a video mode that displays correctly on the TV (not jumping and not misaligned) still damage the TV in the long run? Of the two progressive modelines mentioned above, the second one at 12 MHz seems to be as quiet as a normal TV-programme picture, whereas the other one at 13.87 MHz made a little more "tube noise" in some games (as far as I noticed in a short test), but I still wouldn't consider that much of a concern.

- Is it possible to send an interlaced video mode to a PAL TV that does not flicker at all, or do interlaced pictures always flicker at least a little? The interlaced modes I tried are not really enjoyable. Yet I read somewhere that PAL is usually always interlaced. Is the normal TV programme interlaced? (That would surprise me.) It doesn't flicker.

- Is it possible to find non-interlaced (progressive) modes for a PAL TV, generated by a normal graphics card, in the various resolutions used by the large majority of arcade games? (See: http://www.arcade-museum.com/monitor.html)
  lrmc always calculates modelines with a vertical resolution of 288 for non-interlaced modes. I'm not sure I would be able to find progressive modes with 576 lines, even if I managed to calculate them manually. The manual of my TV says it accepts "PAL B/G", and Wikipedia lists detailed specs for that.

- Do I have to care about the refresh rate as well, regarding the speed of games? I know, for example from the Amiga age, that NTSC and PAL games ran at different speeds.

Sorry for the long post, but I tried to be precise and I have many thoughts in my mind about all this.

« Last Edit: March 20, 2012, 03:10:11 pm by Lomaxx »

Paradroid

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 687
  • Last login:July 12, 2025, 08:11:33 pm
    • SCART Hunter
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #10 on: March 20, 2012, 03:43:26 pm »
Since you're in Germany and, by all accounts, well-suited SCART TVs are abundant there, I'd recommend grabbing a TV that has been confirmed as "ideal" for MAME use. That way you're not fighting the television as well as the computer while you sort all this out. :)

Have a look at my blog for some contenders: SCART Hunter

Also, search for posts by apfelanni. He has posted heaps of good info on various SCART models well suited for MAME.

Good luck!
My MAME/SCART/CRT blog: SCART Hunter

Calamity

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 7463
  • Last login:Today at 04:03:33 am
  • Quote me with care
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #11 on: March 20, 2012, 06:39:44 pm »
Hi Lomaxx,

First of all, check this page for a correct VGA-SCART soldering scheme:
http://www.geocities.ws/podernixie/htpc/cables-en.html

I'm guessing the problem is that you didn't solder hsync and vsync together using the proper resistors, in order to achieve composite sync without forcing it from the video driver.
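Roughly, the mixing part of that scheme looks like this (values as given on that page - double-check there before soldering, I'm only sketching the idea):

Code: [Select]
VGA pin 13 (HSync) ---[ 1k ]---+
                               +---> SCART pin 20 (composite sync input)
VGA pin 14 (VSync) ---[ 1k ]---+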

If you're still interested in modeline calculation, download and try SwitchRes, it's light-years beyond lrmc, promise.

Once you solve the sync issues, you'll probably be able to use GroovyMAME without further restrictions. Now, be aware that the PAL/NTSC options in GroovyMAME are not very useful; you don't really want to use those. Better to go for the default cga setting or possibly the h9110. For the desktop (grub option), it's fine if you stick with PAL, but for modeline generation the PAL/NTSC modes have too narrow ranges to be useful at all.
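In mame.ini that's just the monitor setting, something like the lines below (option names from memory - run groovymame -showconfig and check the SwitchRes section of your build for the exact spelling):

Code: [Select]
#
# SWITCHRES OPTIONS
#
monitor      cga
# or: monitor h9110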

PAL is a TV broadcast standard, nothing to do with the RGB signals we're using here. It's interlaced, 576i indeed, believe it or not.

INTERLACE = FLICKER

You'd probably get a less offensive flicker if you used the filter options in MAME when running interlaced modes. A TV designed for the PAL area will usually accept RGB signals over a range of vertical resolutions and refresh rates, either progressive or interlaced, through the SCART connection; some brands and models are better than others, and 100 Hz TVs are out of the equation. Check Paradroid's threads and blog.

A video mode that is stable on your TV and is within its working ranges (frequencies) is not supposed to break anything, in the same way that cruise ships are not supposed to sink these days.

Taking care of game refresh rates is not only important but the key to smooth, accurate and enjoyable emulation.
« Last Edit: March 20, 2012, 06:42:28 pm by Calamity »
Important note: posts reporting GM issues without a log will be IGNORED.
Steps to create a log:
 - From command line, run: groovymame.exe -v romname >romname.txt
 - Attach resulting romname.txt file to your post, instead of pasting it.

CRT Emudriver, VMMaker & Arcade OSD downloads, documentation and discussion:  Eiusdemmodi

Paradroid

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 687
  • Last login:July 12, 2025, 08:11:33 pm
    • SCART Hunter
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #12 on: March 20, 2012, 11:15:28 pm »
A video mode that is stable on your TV and is within its working ranges (frequencies) is not supposed to break anything, in the same way that cruise ships are not supposed to sink these days.

:)

Sorry, Lomaxx, Calamity is right: your PAL television should be able to cope with a range of refresh rates, not just the regular PAL 50 Hz. All the SCART TVs I've tried (40+) have been designated as PAL and have all been able to sync to 60 Hz (and other refresh rates) just fine. However, the image quality and the ease of adjusting the geometry vary greatly between different brands and models.

Just wanted to clarify that! :)
My MAME/SCART/CRT blog: SCART Hunter

Lomaxx

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 40
  • Last login:April 23, 2012, 02:42:54 pm
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #13 on: March 23, 2012, 06:56:16 am »
Thanks for your replies, Paradroid and Calamity.

Since none of you commented on my question about the healthiness of running a TV turned 90°, I assume I don't have to worry about it. So far it hasn't exploded right in my face. :P
I also tried to upload pictures and a video, but the ones I took weren't really helpful, so I deleted them again.

So now I am left with yet another slightly different soldering scheme. ;) I think I will start all over again and build a cable that is compatible with Nvidia cards (thanks to the 4070 chip mentioned here) and that uses a USB cable as the power source for the 4070 and (via a resistor) as the power source for SCART pin 16. Then all that is missing is the +12V for the aspect-ratio detection on pin 8, but I will hopefully never need that, and I have no idea where I could easily get it from, as the cable is for external use (not inside a cabinet or case, so no Molex).

This page shows a scheme which uses a transistor instead of the 4070 chip to provide Nvidia compatibility. Which one would you recommend? Does it make a difference in quality? Using the transistor would of course be somewhat easier.

Will it work to take both the power for the 4070 chip (5V) and, via a 100-ohm resistor, the voltage for RGB detection at pin 16 (1-3V) from the USB 5V source at the same time? I'm not much into electronics, so I'm not sure.

Which type of resistors do you recommend? Carbon film, metal film, precision metal film?

Just to be sure I am understanding the scheme for the 4070 chip correctly: the "4k7" parts are 4.7 kOhm resistors and the "10microF" parts are capacitors. Any special capacitors (ceramic, electrolytic, film, styroflex, ...)?

You might have noticed that I'm pretty much an electronics newbie. ;)


Calamity

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 7463
  • Last login:Today at 04:03:33 am
  • Quote me with care
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #14 on: March 23, 2012, 07:21:21 am »
Hi Lomaxx,

The 4070 chip is for the VGA->Arcade scheme; the one I meant is the VGA-SCART scheme at the top of the page. That one only uses two 1K resistors, so I think it's the easiest one. Anyway, I haven't used or built any of them, but the link I provided is from a very reliable source.

Rotating a TV 90º is perfectly fine; search the internet for "tate" and you'll see hundreds of people doing this. Just make sure to switch the TV off before rotating, or you'll get magnetized areas on the screen.
Important note: posts reporting GM issues without a log will be IGNORED.
Steps to create a log:
 - From command line, run: groovymame.exe -v romname >romname.txt
 - Attach resulting romname.txt file to your post, instead of pasting it.

CRT Emudriver, VMMaker & Arcade OSD downloads, documentation and discussion:  Eiusdemmodi

Lomaxx

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 40
  • Last login:April 23, 2012, 02:42:54 pm
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #15 on: March 23, 2012, 07:58:38 am »
Hi Calamity.

I'm aware that the 4070 is mentioned in the VGA->Arcade scheme, but I thought that for an Nvidia card I could combine the 4070 scheme with the VGA->SCART scheme shown there, which I suspect in that form only (or mostly) supports ATI cards. So I should use either the transistor mentioned in the other link from my last post or the 4070 chip.
Well, maybe I had better stick to the transistor version before I do something terribly wrong.

It's good to know about the 90°-turned TV and about switching it off before turning it. Thanks. The search terms I used didn't bring up any useful links.

Calamity

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 7463
  • Last login:Today at 04:03:33 am
  • Quote me with care
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #16 on: March 23, 2012, 09:07:25 am »
I'm aware that the 4070 is mentioned in the VGA->Arcade scheme, but I thought that for an Nvidia card I could combine the 4070 scheme with the VGA->SCART scheme shown there, which I suspect in that form only (or mostly) supports ATI cards.

The scheme I mentioned already mixes hsync and vsync, so it should work with any card. It's probably equivalent to the one with the transistor. The scheme that's ATI specific is the one you first posted, as it relies on the card being able to output composite sync through pin 13.

Have a look at this:
http://shmups.system11.org/viewtopic.php?t=7715
Important note: posts reporting GM issues without a log will be IGNORED.
Steps to create a log:
 - From command line, run: groovymame.exe -v romname >romname.txt
 - Attach resulting romname.txt file to your post, instead of pasting it.

CRT Emudriver, VMMaker & Arcade OSD downloads, documentation and discussion:  Eiusdemmodi

MonMotha

  • Trade Count: (+2)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 2378
  • Last login:February 19, 2018, 05:45:54 pm
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #17 on: March 25, 2012, 06:25:38 am »
All the reasonably modern (TNT2 and newer, but I haven't tried anything newer than an 8000 series) nVidia cards I've played with have been able to do composite sync (of either polarity) on the HD15 HSYNC output.  The Windows driver often doesn't expose the option for it, but adding +CSync or -CSync to a modeline in X.org has always worked for me.  YMMV, of course.
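In other words, a modeline like the one below should give composite sync straight from the HSYNC pin - timings borrowed from the mode that already worked on the ATI card, so treat it as a starting point rather than a verified nVidia mode:

Code: [Select]
# negative composite sync on the HSYNC line (try +CSync if the picture won't lock);
# drop "interlace" for a 288-line progressive mode
Modeline "720x576_50i_csync" 15.125 720 778 834 968 576 579 607 625 interlace -CSync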

Also, yes, interlaced modes will likely flicker pretty visibly on computer graphics.  You don't notice it on most TV because the images are either photorealistic (and therefore lacking in sharp lines) or have the graphics specifically designed around the limitations of TV.  Computer graphics often have single pixel high lines which flicker rather badly when interlaced.

Lomaxx

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 40
  • Last login:April 23, 2012, 02:42:54 pm
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #18 on: March 25, 2012, 07:08:30 am »
I have now modified the cable and soldered in the two-1k-resistor part, thus connecting VGA pins 13 AND 14 to SCART pin 20 as shown here.
I also implemented the connections between SCART 17 and SCART 18 as well as VGA 5 and VGA 10. Since VGA 10 is connected to all the other shields/grounds by the design of the VGA cable (not my modification), all of the following pins are now connected:

VGA: 5, 6, 7, 8, 10, outer shielding
SCART: 5, 9, 13, 17, 18, 21

Additionally, for the RGB input detection I am using an external PSU (3V) connected to SCART pins 16 and 18.

The cable seems to be OK. It's working, and I no longer need to specify "composite" or replace "-hsync/-vsync" with "+hsync/+vsync". Because of this I can now also use GroovyMAME's switchres.

However, there are still issues:

The strangest thing (which I do not understand) is that the TV picture now gets tinted (slightly purple, as far as I recall) when I turn the TV by 90°. The horizontal orientation works; tate mode shows that tint. And I did turn off the TV with the remote AS WELL as with the hardware power switch on the case, and I verified that the plug was properly seated. I tried turning it twice with the same result. I'll test it more later.

The desktop picture when using non-interlaced modelines is too large (as already mentioned). This also applies to some games when running GroovyMAME with switchres (for example Mr. Do!). Is that related to some internal hardware property of the TV that cannot be changed?

Sound is crackling. This must be related to vsync or something (?). When I completely remove the xorg.conf and run GroovyMAME without a connected monitor, or with my LCD monitor (attached through VGA), the sound is correct.

I wrote this a bit in a hurry, because I was told I have to leave while writing it. I will be back later.


« Last Edit: March 25, 2012, 08:37:14 am by Lomaxx »

Lomaxx

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 40
  • Last login:April 23, 2012, 02:42:54 pm
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #19 on: March 25, 2012, 12:19:15 pm »
I tried the cable with my Nvidia-card PC and it works nicely there too. So now I can confirm that this scheme works with Nvidia cards.

In general I am having fewer issues with the Nvidia PC. The desktop is still a bit too large, but so far the games seem to fit nicely. Also, the few games I tried run without crackling sound. Raiden and Donkey Kong run fine. In Mr. Do! there is a very small stutter in the sound. Maybe I can get rid of that somehow.

The best part is that I also managed to set up a dual-head setup (not TwinView) where I can use the TV and my LCD monitor at once. I cannot move windows from one to the other that way, but I just have to move the mouse pointer over to the other screen, launch GroovyMAME or mplayer there, and that screen will be used for output.

The biggest problem left is the tinted colour when I rotate the TV. It doesn't make me want to use it like that. I'll keep trying.

Lomaxx

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 40
  • Last login:April 23, 2012, 02:42:54 pm
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #20 on: March 25, 2012, 01:32:32 pm »
Hm, maybe I wasn't careful enough while rotating the TV. I found this thread in another forum.

Before stumbling across that info, I tried it again and the colour tint in one corner was still present, also with the normal TV programme and without the cable. So it's definitely the TV itself. Luckily, it's still working without issues in the vertical position, but I doubt that I will be rotating it anymore.
Big bummer, but there are worse things.

Calamity

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 7463
  • Last login:Today at 04:03:33 am
  • Quote me with care
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #21 on: March 25, 2012, 02:03:17 pm »
Sound is crackling. This must be related to vsync or something (?). When I completely remove the xorg.conf and run GroovyMAME without a connected monitor, or with my LCD monitor (attached through VGA), the sound is correct.

Hi Lomaxx,

Which GroovyMAME version are you using? There are some sound issues in the Linux builds since v0.145, due to a change in the source of the patch; we're still trying to figure out how to solve that. Any version previous to v0.145 (patch 013e) should be fine regarding sound.

The purple tint is due to magnetization. TVs usually have an auto-degauss system that kicks in when the set is turned on from the main power button; it might not be working on yours.

Overscan is by design in most TVs and is usually fixable through service menu settings. You can compensate for horizontal overscan in software (modeline/GroovyMAME tweaks); vertical overscan can only be fixed in hardware (potentiometers, service menu).
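To illustrate the horizontal part with the mode you already have working: keeping the line rate at exactly 15.625 kHz but enlarging the horizontal total (and raising the dot clock to match) makes the 720 active pixels cover a smaller slice of each scan line, so the picture gets narrower. The numbers below are only an illustration of the idea, not a tuned mode:

Code: [Select]
# original:          15.125 MHz / 968 total -> 720 active pixels = ~74% of each line
# Modeline "720x576" 15.125 720 778 834 968 576 579 607 625 composite interlace +hsync +vsync
# wider total, same 15.625 kHz line rate -> 720 active = ~71% of each line (image ~5% narrower)
Modeline "720x576_narrow" 15.875 720 802 858 1016 576 579 607 625 composite interlace +hsync +vsync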
Important note: posts reporting GM issues without a log will be IGNORED.
Steps to create a log:
 - From command line, run: groovymame.exe -v romname >romname.txt
 - Attach resulting romname.txt file to your post, instead of pasting it.

CRT Emudriver, VMMaker & Arcade OSD downloads, documentation and discussion:  Eiusdemmodi

Paradroid

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 687
  • Last login:July 12, 2025, 08:11:33 pm
    • SCART Hunter
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #22 on: March 25, 2012, 03:48:26 pm »
Luckily, it's still working without issues in the vertical position, but I doubt that I will be rotating it anymore.

I wouldn't give up so easily. :) Many TVs have a degauss coil that is temperature controlled, i.e. it only kicks in when the TV is powered up from a cold start.

Just last night I tried rotating my modified Blaupunkt so I could check out how Exerion and Raiden looked on it. I turned the TV off (after playing horizontal) and waited a minute before rotating. When I turned it back on (in vertical position) there were some patches of discolouration. I turned it off and tried again. Same result. :(

I decided to leave it off for 5 minutes and grab a beer. When I came back and switched it on, the image was perfect with no discolouration. :) So, it was either the beer that made the difference or the fact that I let the TV cool so that the degauss would switch on. I guess the other option is that it might take a couple degauss attempts for the patches to disappear.
My MAME/SCART/CRT blog: SCART Hunter

Lomaxx

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 40
  • Last login:April 23, 2012, 02:42:54 pm
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #23 on: March 26, 2012, 09:12:00 am »
@ Calamity:

ATI-computer:     M.A.M.E. v0.143 (Mar 17 2012) SwitchRes Patch 0.013b
NVIDIA-computer: M.A.M.E. v0.143 (Feb 12 2012) SwitchRes Patch 0.013b

Installed on Gentoo through a /usr/local/portage ebuild, most likely taken from here (I can't recall exactly, but the link was marked as "visited"). Don't worry about the stuttering for now. At the moment I only plan to use the Nvidia PC for GroovyMAME, and as already mentioned I don't get much of that stuttering there. A few moments ago I even had no stuttering in "Mr. Do!", probably because I once again used a different xorg/GroovyMAME configuration.

Big thanks for pointing me to the service menu; I didn't even know something like this existed. In some wondrous way I even managed to find the service menu code for my Nokia TV without finding the exactly matching service manual. It seems to be one of many Nokia standards.

@Paradroid:
You are right. Today the first thing I did was rotate the TV while it was still switched off, and all the colours are fine. It seems I can still dare to rotate it, carefully. The best thing about your hint was the reminder to get a beer while waiting. :D


Now about the service menu: I don't want to play around with options I don't understand. Does any of you have an idea which of the following options might be the ones I have to adjust? I will list them all, though I know some are definitely not related.

    - V.MID-POS: 34
    - ZOOM: EIN
    - AGC 05
    - CORING: AUS
    - FLYB MODE: AUS
    - VT CHAR: WEST TURKEY
    - CARRIER-MUTE: EIN
    - NICAM: AUS
    - C4 BIT CHECK: EIN
    - LOUDNESS: EIN
    - FRONT AV: AUS
    - SPORT: AUS
    - INVAR: AUS
    - OSD SHIFT 42
    - RED 60
    - BLUE 49
    - GREEN 49
    - S.COR. 31
    - P.CORN. 13
    - P.TILT 27
    - P.AMPL. 17
    - H.AMPL. 49
    - H.SHIFT. 28
    - V.AMPL. 35
    - V.TOP-POS. 38

Is anything in this list related to overscan? Anything else worth fiddling with? Anything I should not touch at all? ;)
« Last Edit: March 26, 2012, 09:13:34 am by Lomaxx »

Paradroid

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 687
  • Last login:July 12, 2025, 08:11:33 pm
    • SCART Hunter
Re: Need help: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #24 on: March 27, 2012, 02:24:55 am »

- V.MID-POS: 34
- ZOOM: EIN
- AGC 05
- CORING: AUS
- FLYB MODE: AUS
- VT CHAR: WEST TURKEY
- CARRIER-MUTE: EIN
- NICAM: AUS
- C4 BIT CHECK: EIN
- LOUDNESS: EIN
- FRONT AV: AUS
- SPORT: AUS
- INVAR: AUS
- OSD SHIFT 42
- RED 60
- BLUE 49
- GREEN 49
- S.COR. 31
- P.CORN. 13
- P.TILT 27
- P.AMPL. 17
- H.AMPL. 49
- H.SHIFT. 28
- V.AMPL. 35
- V.TOP-POS. 38



The menu items that I think are to do with geometry and sizing are the ones at the end of your list (V.MID-POS, S.COR., P.CORN., P.TILT, P.AMPL., H.AMPL., H.SHIFT., V.AMPL. and V.TOP-POS). The others look like factory preset feature options and drive/cutoff controls (used for colour balancing).

You've already done the #1 thing you should do before experimenting with the service menu: take note of the original values! I can't imagine you'll go too far wrong if you now start to slowly +/- some of those values and observe the effect each has on the image. You can always punch the original values back in if you muck things up.

Basically, you want to find the controls for V-size, V-position, H-size and H-position. Then, if there are pincushion and trapezium controls, that's a bonus (my guess is that these are the "P.x" controls). With these, you should be able to get the image perfectly sized and shaped. One of those controls might be related to vertical linearity. The effect of this is subtle when adjusting but is important if you want the image to be even (running up and down the screen).

Good luck! :) Once you sort it out, you'll soon get sick of getting in and out of that service menu if you plan on changing resolutions often. ;)
My MAME/SCART/CRT blog: SCART Hunter

Lomaxx

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 40
  • Last login:April 23, 2012, 02:42:54 pm
Re: Solved: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #25 on: March 30, 2012, 12:19:15 pm »
I decided to mark the thread as solved, as most of the basic issues I had regarding the cable and the TV have been solved. Big thanks for all your help.

I think I will only adjust the TV once, to get rid of the overscan, and not for each resolution to get a perfectly filled picture. I don't mind if black bars are left at the borders; as long as I can use a non-interlaced mode and have the right aspect ratio I am fine. Luckily the TV has that zoom option, which will also help me. So I will try to find a reasonably good setting for running other emulators in fullscreen (pcsxr, Stella, x64; I still need to find out how to get correct fullscreen output there).
Also, I am still working on the xorg configuration in order to improve the result. I need to get rid of the oversized virtual desktop, and I don't want the mouse to move over to the second screen by default.
But for all that I will sooner or later head over to the GroovyMAME subforum, where it fits better IMHO.

However, there is still a question left regarding the video hardware:

Did all (or most) arcade games have a 4:3 ratio (no matter whether vertical or horizontal) and completely fill the screen? This site lists the most-used resolutions, but when I compare the vertical and horizontal resolutions I don't always end up with a 4:3 ratio. Maybe because the "pixels" on the CRT monitor were not square in the end? In any case, I noticed that the MAME games I tested on my SCART TV differ in their ratio. That might be due to my MAME configuration, since - as far as I can tell - I have to keep the option "keepaspect" set to "0".
I don't want to go into the depths of MAME configuration here. I just wonder whether, in the ideal case, each game should fill my whole 4:3 TV, or whether there are games that will leave empty space at the top/bottom or left/right because otherwise the ratio would be messed up.

Paradroid

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 687
  • Last login:July 12, 2025, 08:11:33 pm
    • SCART Hunter
Re: Solved: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #26 on: March 30, 2012, 06:46:30 pm »
Well, I like games to completely fill the screen. Sometimes I leave a small border at the top and/or bottom (a few millimetres) so that the score and credit text aren't clipped. I usually have just enough overscan so that the sides are clipped by the screen mask. Looks neater that way.

Once that's done, the game should be pretty close to 4:3 regardless of the actual resolution.
My MAME/SCART/CRT blog: SCART Hunter

Calamity

  • Trade Count: (0)
  • Full Member
  • ***
  • Offline Offline
  • Posts: 7463
  • Last login:Today at 04:03:33 am
  • Quote me with care
Re: Solved: VGA2SCART on Linux/Nvidia to CRT-TV
« Reply #27 on: March 31, 2012, 03:00:06 pm »
Did all (or most) arcade games have a 4:3 ratio (no matter whether vertical or horizontal) and completely fill the screen? This site lists the most-used resolutions, but when I compare the vertical and horizontal resolutions I don't always end up with a 4:3 ratio. Maybe because the "pixels" on the CRT monitor were not square in the end?

Pixels weren't usually 'square' (the pixel aspect ratio wasn't 1:1). Games were actually 4:3 because the target monitors were 4:3, regardless of the fact that 320/224 or whatever is not usually 1.3333. What happened is that operators adjusted their monitors for each specific game so that the picture covered the screen vertically. This spreads the scan lines more or less depending on the vertical resolution. Usually this is required in order to get really round circles in many games (explosions, etc.), so it seems to have been expected by the game designers.
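To put numbers on it: a 320x224 game stretched to fill a 4:3 screen ends up with a pixel aspect ratio of (4/3) x (224/320) ≈ 0.93, i.e. each pixel is displayed slightly taller than it is wide - and that stretching is exactly what the monitor adjustment gives you for free.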
« Last Edit: March 31, 2012, 03:08:51 pm by Calamity »
Important note: posts reporting GM issues without a log will be IGNORED.
Steps to create a log:
 - From command line, run: groovymame.exe -v romname >romname.txt
 - Attach resulting romname.txt file to your post, instead of pasting it.

CRT Emudriver, VMMaker & Arcade OSD downloads, documentation and discussion:  Eiusdemmodi