HDMI to Old CRT
Zebidee:
Thanks for sharing that schematic.
The LM555-based timer circuit is an interesting approach. I've achieved the same effect by simply using a capacitor (say, 100uF) across the TV power button terminals (you can also add a resistor in series to increase the delay). However, the timer circuit is elegant and may work in situations where the cap trick won't.
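If you want to ballpark the delay, it scales with the RC time constant. Quick Python sketch (the pull-up value is an assumed figure for illustration, not from any schematic):

```python
# Rough RC estimate for the cap-across-the-button trick. The 10k
# pull-up is an assumed figure for illustration, not from any real
# schematic: the discharged cap looks like a pressed button until
# it charges up through that resistance.
C = 100e-6          # 100uF cap across the button terminals
R_pullup = 10e3     # assumed pull-up on the button line
R_series = 1e3      # optional series resistor to stretch the delay

tau = (R_pullup + R_series) * C   # RC time constant, in seconds
print(f"time constant ~ {tau:.2f} s")   # ~1.10 s with these values
```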
Concerning the RGB input lines... I guess those 1.3k resistors are part of the OSD-MUX RGB mod. I'd normally expect to see those 100nF caps AFTER the resistors, and physically as close as possible to the jungle/OSD IC. I don't know why they are on that input breadboard and before the resistors. However, for all I know, there may be extra caps inside the TV as well, closer to the IC. :dunno
That circuit to combine the sync... basically just an NPN transistor, which will give you AND-logic sync. This is probably OK for games at 240p and other non-interlaced video modes, but not so great for interlaced modes (like those you typically use for the PC's desktop), because it'll lose all the horizontal timing pulses that are normally expected during the vertical sync interval. Some TVs can't actually lock on properly without them, at least in interlaced modes. It also won't work with anything but negative sync on both H & V.
Even if you just twist the wires together you'll still get those horizontal timing pulses during the vertical sync period, but their polarity will be reversed :)
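To make that concrete, here's a toy truth table in Python (logic only, not the circuit), with 0 = sync pulse and 1 = idle for negative sync:

```python
# Toy truth table (logic only, not the circuit). Negative sync
# assumed: 0 = sync pulse, 1 = idle.
def and_combine(h, v):
    return h & v    # roughly what the single-transistor combiner gives

def xor_combine(h, v):
    return h ^ v    # what an XOR-gate combiner gives

for v in (1, 0):        # idle, then inside the V sync interval
    for h in (1, 0):    # idle, then during an H pulse
        print(f"V={v} H={h} -> AND={and_combine(h, v)} XOR={xor_combine(h, v)}")
# During V sync (V=0) the AND output is stuck at 0, so the H timing
# is lost; the XOR output still toggles with H, just polarity-flipped.
```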
So, there are other options to try for the sync, including some that are quite simple (e.g. generating composite sync output via VMMaker).
If you want to know more about sync variants, I suggest you have a look at this blog post from Ste of Retrovision.
Also, VGA sync voltage will be too high for TVs, and there is nothing on the sync lines to pull the voltage down except those 100ohm resistors, which are not sufficient. You could put another resistor in series *after* the transistor, say ~470R/510R/680R to pull the sync voltage down from ~4-5v to something like the ~0.3v expected by TVs.
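To sanity-check the values: if we assume the TV's sync input looks like roughly a 75 ohm load to ground (an assumption on my part; jungle IC sync inputs vary), the series resistor forms a simple divider:

```python
# Divider math for the series sync resistor. The 75 ohm load is an
# assumption for illustration; many jungle IC sync inputs are higher
# impedance, in which case the resistor mostly just limits current.
def sync_level(v_in, r_series, r_load=75.0):
    return v_in * r_load / (r_load + r_series)

for r in (470, 510, 680, 1000):
    print(f"{r}R series -> ~{sync_level(5.0, r):.2f} V from a 5 V source")
# 470R -> ~0.69 V, 510R -> ~0.64 V, 680R -> ~0.50 V, 1K -> ~0.35 V
```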
Lastly, the blanking signal appears to be raw 5v injected directly. No diode or current-limiting resistor or anything? The TV only has to detect the voltage. Another thought: there are often multiple blanking voltages for different modes, and the TV might actually be happier with something like 3v. Modders will often put in a basic voltage divider, involving a potentiometer, to "tune in" the blanking signals. Once the needed blanking voltages are known, the pot can be replaced with permanent resistors.
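The divider math is simple enough to sketch (values below are made up for illustration, not from the schematic):

```python
# Divider math for "tuning in" a blanking voltage. Values are
# illustrative only, not taken from the schematic.
def divider_out(v_in, r_top, r_bottom):
    return v_in * r_bottom / (r_top + r_bottom)

print(f"{divider_out(5.0, 2000, 3000):.1f} V")   # ~3.0 V from 5 V
# In practice: wire a pot (say 10k) across 5v and GND, feed the
# blanking pin from the wiper, adjust until the TV is happy, then
# measure and substitute the nearest standard fixed resistors.
```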
For convenience, I've linked the schematic below.
abstract3000:
@zebidee thank you sir! I greatly appreciate the time you took to provide that feedback. My friend apparently set this up to act as a CGA monitor, like the original arcade ones, without truly understanding the context in which the monitor was being used, I suppose.
If I used CRT Emudriver's csync option, would that just entail disconnecting the H & V sync and leaving pins 13 & 14 with nothing attached?
If you have some ideas on how to approach the sync issue, I could scan the actual schematics for the television and upload them for you to view?
Again thank you so much for taking the time :)
buttersoft:
Composite sync from crt_emudriver would be delivered on VGA pin 13. I think, based on the above, you'd just bypass the H- and V-sync combiner circuit and feed the c-sync directly to the TV
Zebidee:
--- Quote from: buttersoft on September 30, 2023, 07:33:53 pm ---Composite sync from crt_emudriver would be delivered on VGA pin 13. I think, based on the above, you'd just bypass the H- and V-sync combiner circuit and feed the c-sync directly to the TV
--- End quote ---
Yes, do exactly this, except feed the composite sync from pin 13 through a resistor (I suggest 1K, but really anything from 470R to 1K should be fine). Then disconnect pin 14 altogether, otherwise you might get ghost signals. This is the simplest neat solution.
If you want to get fancy, or your friend does, you could try this XNOR composite sync circuit by Tomi Engdahl. It only needs one quad-logic chip (four XOR logic gates on a single chip), a couple of caps and a couple of resistors. There are a few circuits on that page, so I've linked to a pic of the one I mean below.
Ste (Retrovision) also suggests a good/simple XNOR sync circuit in his blog article (linked above) using 3 different logic chips. This definitely works well, though I'd suggest some decoupling ceramic capacitors (100nF on H, 1uF on V) in series on the H+V sync inputs, and similar-value bypass cap(s) for the +5v Vcc inputs (bypass caps, between Vcc and GND, help stabilise the voltage supply).
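For what it's worth, the logic both circuits implement boils down to XNOR, sketched here in Python (logic only): negative H and V in, negative composite sync out, with the H pulses re-appearing inverted during the V interval.

```python
# XNOR sync combining, logic only (the real circuits add the caps
# and resistors). With a quad-XOR chip like a 74HC86, XNOR = one XOR
# gate followed by a second XOR gate wired as an inverter (one input
# tied high).
def xor(a, b):
    return a ^ b

def xnor(a, b):
    return xor(xor(a, b), 1)    # XOR-with-1 acts as the inverter

# Negative sync: 0 = pulse, 1 = idle.
for h, v in ((1, 1), (0, 1), (1, 0), (0, 0)):
    print(f"H={h} V={v} -> csync={xnor(h, v)}")
# Idle -> 1; either pulse alone -> 0 (negative csync); both -> 1,
# i.e. the H pulses survive the V interval with inverted polarity.
```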
As suggested above, add a ~1K resistor in series with the sync output of whatever option/circuit you use, to pull the voltage level down to the ~0.3v the TV expects.
If you want to be really clever and make the sync even tighter, add a schottky diode (1N5817, 1N5818, 1N5819 are all good) after the resistor, between the sync output line and GND, in reverse-bias orientation (anode to GND, cathode [end with stripe] to sync). The diode will "clamp" the sync to GND, effectively giving it a reference and making it more stable. Some won't bother, but the diodes are cheap and I find they can help in some cases. I suggest schottky diodes as they are good for fast signals and have a conveniently low forward voltage drop of ~0.3v.
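If it helps to picture what the clamp does, here's a toy model (the ~0.3v drop is the nominal schottky figure; real parts vary a bit):

```python
# Toy model of the Schottky clamp. Assumed ~0.3 V forward drop;
# real parts vary a little. Anode to GND, cathode on the sync line
# means the line can't swing below about -0.3 V.
V_F = 0.3

def clamp(v):
    return max(v, -V_F)

for v in (0.7, 0.0, -0.2, -1.5):    # illustrative line voltages
    print(f"{v:+.1f} V -> {clamp(v):+.1f} V")  # only -1.5 V gets clipped
```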
I can draw my suggestions onto a circuit diagram if you are seriously interested.
abstract3000:
@Zebidee or @buttersoft
I find myself between a rock and a hard place: I'm the middleman passing information back and forth, and my own knowledge is painfully lacking. I have a friend who knows electronics and can do what's asked, but he may not necessarily understand the context of what's being asked or what we're trying to achieve, whereas here on the forums we have experts who know the context and what we're trying to achieve.
Today I got some disappointing news: my friend believes the limitation is in the software, which outputs an analog signal rather than a digital one. Here are the points he made:
* The software only appears to produce analog RGB output. He suspects that a card capable of digital RGB would make the CGA option show up and switch the signal from analog to digital.
* The jungle IC input appears to be locked to digital (he has no way of knowing whether it will take analog). Even if it did, it would require a software change that is not available in the service menu, i.e. custom reprogramming of the EEPROM.
* The circuit interface works fine with a true CGA (TTL) signal, though he doesn't believe an analog signal was ever considered, as there is no decoupling capacitor or shunt resistor in the RGB feed from the CPU to the jungle, just a single voltage-dropping resistor inline.
* He believes the only way to do it would be to send a true CGA signal, or to interface directly with the neck board (which he does not recommend).
I'm not sure why he keeps bringing up CGA. I have photos from messing with the board and briefly getting the signal to show up, so I know the software is capable of getting the image onto the TV, but his response to that was as follows:
When I was touching the board, I was passing the 5V (from the turn-on circuit) to the signal pins, effectively pulling the signal up to close to the TTL levels CGA used.
He stated he could add resistors to do that in the circuit, but he sees two potential issues:
- I could potentially lose a lot of contrast
- He is also worried that running injected voltage back through the video card may cause damage.
(He could decouple with capacitors, but stated that might put us back where we started)
The last bit of advice was not to confuse the signal itself with the resolution. CGA is a TTL-level signal, meaning each of the three color lines is only ever on or off; together with the intensity line, the CGA palette can only produce 16 colors, which is fine for a TV's OSD. The interface injects the signal from the computer into the jungle at the same spot the CPU injects the OSD.
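If I understand it right, the 16-color figure checks out once you count the intensity line alongside the three color lines, e.g.:

```python
# Where TTL RGBI's 16 colors come from: R, G, B and Intensity are
# each a single on/off line, so 2**4 combinations.
from itertools import product

palette = list(product((0, 1), repeat=4))   # (R, G, B, I) tuples
print(len(palette))                         # 16
```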
So at this point I'm not sure what to do. Would it be helpful to scan the TV schematics I have?
Does CRT Emudriver have the ability to push out a digital signal as opposed to an analog signal?
Is he right in his assumption that injecting the 5V would cause issues with the contrast and potentially ruin the video card?
I really appreciate any time or assistance :)