
LED wiring question with diagrams


SavannahLion:


--- Quote from: bfauska on May 08, 2007, 07:13:18 pm ---I am totally uncertain about this, but it seems to me that you could wire two series pairs of LEDs in parallel, assuming that the LEDs were rated for >2.5V each.

Is that not correct? I could swear I have looked around and seen equations that make this sound right. Randy? I would totally trust your opinion on the matter if you happen to check this thread again.

--- End quote ---

Thinking of this maybe?

bfauska:

I hadn't seen that; it seems cool. I still don't understand why it insists on putting a resistor in every example I come up with, though, including a 2V source and 2V LEDs, in any quantity.

MonMotha:

An issue is that LEDs are specified with a "typical" forward voltage drop at the rated current. They are quite non-linear in their current/voltage characteristics, so the resistor is used to keep the current in check across process variation. Your power source is also a "typical" rating, and it may vary some (+/-10% is not uncommon for a large power supply!). This variation is the reason for avoiding paralleled LEDs, and it is also why designers almost always put in a series resistor and run the LED from a supply slightly higher than its rated forward voltage drop.

Consider: you have a 2V source and are given a "2V @ 20mA" LED. Your 2V source can typically supply several times this current. Say that your "2V" LED exhibits only a 1.85V drop at 20mA. Say also that your voltage source is running a little high and is putting out 2.2V. At 2.2V, this specific LED might conduct 40-60mA. Remember, this isn't a resistor - the dependency of current on voltage is not linear, it's actually exponential! 60mA at 2.2V is 132mW, compared with the expected dissipation of only 37mW at 1.85V/20mA (remember, LEDs are rated by current). Clearly, this is several times larger. In fact, it's so much larger that the LED will probably be damaged.
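
To make the shape of that curve concrete, here's a minimal Python sketch. The exponential model and all of its constants (the 0.15V slope especially) are made-up illustrative values for a hypothetical "2V @ 20mA" LED, not measurements; real parts also have some internal series resistance that tempers the blow-up a bit.

--- Code: ---# Toy diode model: current rises exponentially with voltage.
# All constants are illustrative guesses, not data from a real LED.
import math

I0, V0 = 0.020, 1.85   # assumed measured point: 20mA at 1.85V
VS = 0.15              # exponential slope in volts (made up)

def led_current(v):
    return I0 * math.exp((v - V0) / VS)

for v in (1.85, 2.0, 2.2):
    i = led_current(v)
    print(f"{v:.2f}V -> {1000 * i:6.1f}mA, {1000 * v * i:6.1f}mW")
# A 0.35V overshoot multiplies the current roughly tenfold in this
# toy model -- the runaway the series resistor is there to prevent.
--- End code ---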

Consider the same situation, but with a 5V supply instead. We choose the resistor for the "typical" characteristics - 2V Vf, 20mA, 5V supply - and arrive at (5 - 2)/0.02 = 150 ohms. Now, let's say that our supply is running at 5.2V and the LED is dropping 1.85V (the current is unknown - you'd have to characterize the LED to find it exactly, which I won't do). If you solve for current, you get about 22mA, very close to the rating (and you always derate a little to account for a situation like this).
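
That arithmetic is easy to script. A quick sketch of the same sizing and worst-case check, using only the numbers from the paragraph above (the helper name is mine):

--- Code: ---# Series resistor sizing: R = (Vsupply - Vled) / Iled
def series_resistor(v_supply, v_led, i_led):
    return (v_supply - v_led) / i_led

r = series_resistor(5.0, 2.0, 0.020)
print(f"nominal resistor: {r:.0f} ohms")        # 150 ohms

# Worst case from above: supply high at 5.2V, LED dropping only 1.85V
i_worst = (5.2 - 1.85) / r
print(f"worst-case current: {1000 * i_worst:.1f} mA")   # ~22.3 mA
--- End code ---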

As you can see, this makes the system much less susceptible to component variation.  What you're doing is swamping the non-linear behavior of the LED with the linear behavior of the resistor.  The only expense is the resistor itself and the small power dissipated in the resistor (~60-70mW in this case).

Another option is to run the LED from a voltage source which, even accounting for the tolerance of both parts (tolerances are usually given), will always be below the LED's lowest forward voltage at rated current. For example, if we again assume our LED process bottoms out at 1.85V, and our power supply is +/-10%, we can use a 1.85 - 0.185 = 1.665V supply. This usually comes at the expense of a significantly dimmer LED, but it reduces cost and is often done when you want to run an LED straight off batteries, such as in a keychain flashlight.
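
The same derating, as a sketch (the function name is mine; the 1.85V floor and 10% tolerance are the figures from the paragraph above):

--- Code: ---# Pick a nominal supply that stays below the LED's lowest Vf
# even when the supply runs at the top of its tolerance band.
def safe_direct_supply(vf_min, tol):
    return vf_min * (1 - tol)

v = safe_direct_supply(1.85, 0.10)
print(f"max nominal supply: {v:.3f} V")   # 1.665 V
# Check: 1.665V * 1.10 = 1.8315V, still below the 1.85V worst case.
--- End code ---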

If you don't like my simplification of not characterizing the LED, you can iterate a few times and come reasonably close to the actual solution: at 22mA the LED might drop 1.87V instead of 1.85V (since our data point was 1.85V @ 20mA), giving a current of 22.2mA instead of 22.3mA. As you can see, it won't change much.
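
Here's that iteration spelled out in Python. The vf(i) characteristic (linear around the 1.85V @ 20mA point, with an assumed ~10 ohm dynamic resistance) is a guess at a plausible curve chosen to reproduce the 1.87V figure above, not real data:

--- Code: ---# Fixed-point iteration toward the actual operating point.
def vf(i):
    # assumed LED characteristic near 20mA (made-up slope)
    return 1.85 + (i - 0.020) * 10.0

V_SUPPLY, R = 5.2, 150.0
i = 0.020                     # start at the rated current
for step in range(4):
    v = vf(i)
    i = (V_SUPPLY - v) / R
    print(f"step {step}: Vf = {v:.3f}V -> I = {1000 * i:.2f}mA")
# Settles at ~22.2mA after one or two steps, as described above.
--- End code ---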

bfauska:


--- Quote from: MonMotha on May 09, 2007, 03:16:39 am ---Remember, this isn't a resistor - the dependency of current on voltage is not linear, it's actually exponential!
--- End quote ---

So if I am thinking of LEDs the same way I would think about the power going through a series of light bulbs, I am thinking wrong? With lamps, if they are rated at 6V and I run two in series hooked to a 12V supply, I will not blow the lamps, because they DO act as resistors, correct? And what I am hearing is that it is not the same with an LED? I think I have been oversimplifying LEDs in my head. If I am understanding what you are saying... even if you had a perfectly stable PS, you would not be able to run two 2.5V LEDs in series on a 5V PS w/o a resistor?

Level42:

What I've always understood is that it's the current that matters to an LED, not the voltage.
You will always need some resistor in series with the LEDs, unless maybe you put 20 or so LEDs in series.

I put three LEDs in series (for my three buttons per player) with a single resistor, and this works fine. I don't see the point of using three separate resistors and wiring in parallel for this type of application. LEDs last "forever", as long as you don't do crazy things with them....
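
A sketch of the sizing for a string like that, with the supply voltage and LED specs assumed for illustration (they aren't given above):

--- Code: ---# Three LEDs in series with one resistor. Supply and LED values
# below are assumptions for illustration, not the actual parts used.
V_SUPPLY = 12.0        # assumed
VF, I = 2.0, 0.020     # assumed "2V @ 20mA" LEDs
N = 3

r = (V_SUPPLY - N * VF) / I
print(f"series resistor: {r:.0f} ohms")   # 300 ohms
# One resistor per string works because the same current flows
# through every LED in the string; parallel strings would each
# need their own resistor.
--- End code ---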

