The resistor isn't just to reduce the voltage! An LED presents almost no resistance once it starts conducting, so if nothing else in the circuit limits the current, the current draw will be far beyond what the LED can handle (Ohm's law). Thus, snap, crackle, pop!
Most LEDs have their characteristics specified at a current of 20 mA. If you want really good reliability and you can't rule out worse-than-average heat conduction in your mounting, heat buildup wherever you mount them, voltage/current variations, and so on, then design for 15 milliamps.
Now for how to make 15 milliamps flow through the LED:
First you need to know the LED voltage drop. It is safe enough to assume 1.7 volts for non-high-brightness red; 1.9 volts for high-brightness, high-efficiency, and low-current red; 2 volts for orange and yellow; and 2.1 volts for green. Assume 3.4 volts for bright white, bright non-yellowish green, and most blue types. Assume 4.6 volts for 430 nm bright blue types such as Everbright and Radio Shack. Design for 12 milliamps for the 3.4 volt types and 10 milliamps for the 430 nm blue.
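If it helps, here are those rule-of-thumb numbers gathered in one place as a little Python table. This is just a sketch; the grouping labels are my own shorthand, not official categories, and real parts vary, so check the datasheet for your actual LEDs if you have one.

# Rule-of-thumb numbers from the paragraph above.
# Format: label -> (assumed forward voltage in volts, suggested design current in mA)
LED_GUIDELINES = {
    "red (non-high-brightness)":          (1.7, 15),
    "red (high-brightness/low-current)":  (1.9, 15),
    "orange / yellow":                    (2.0, 15),
    "green (ordinary)":                   (2.1, 15),
    "white / blue / bright green":        (3.4, 12),
    "430 nm bright blue":                 (4.6, 10),
}

print(LED_GUIDELINES["white / blue / bright green"])   # (3.4, 12)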
You can design for higher current if you are adventurous or you know heat buildup won't be a problem. In that case, design for 25 mA for the types with voltage near 2 volts, 18 mA for the 3.4 volt types, and 15 mA for the 430 nm blue.
Only meet or exceed the maximum rated current of the LED under favorable conditions with little heat buildup. Some LED current ratings assume really favorable test conditions - such as being surrounded by air no warmer than 25 degrees Celsius and decent thermal conduction from where the leads are mounted. Even under those laboratory conditions, running the LED at its maximum rated current will cost it half its light output by the end of its rated life expectancy (20,000 to 100,000 hours) - and that's the optimistic case! You can use somewhat higher currents if you heat-sink the leads and/or can tolerate a much shorter life expectancy.
Next, know your supply voltage. It should be well above the LED voltage for reliable, stable LED operation. For an IPac or keyboard hack, this is almost certainly 5 volts. Use at least 3 volts for the lower voltage types, 4.5 volts for the 3.4 volt types, and 6 volts for the 430 nm blue. (Darn... I really wanted to use blue LEDs on my panel!)
Now subtract the LED voltage from the supply voltage. This gives you the voltage that must be dropped by the dropping resistor. Example: a 3.4 volt LED with a 5 volt supply. Subtracting these gives 1.6 volts to be dropped across the resistor.
The next step is to divide the dropped voltage by the LED current to get the value of the dropping resistor. If you divide volts by amps, you get the resistor value in ohms. If you divide volts by milliamps, you get the resistor value in kilo-ohms or k.
Example: 5 volt supply, 3.4 volt LED, 12 milliamps. Divide 1.6 by .012. This gives 133.33 ohms. The next higher standard resistor value is 150 ohms.
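If you'd rather let a script do the arithmetic, here is a rough Python sketch of the same calculation. The E12 rounding helper is my own addition, not part of the original guide; use whatever resistor series you actually stock.

# Sketch of the dropping-resistor calculation described above.
E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def next_standard_value(ohms, series=E12):
    """Round a resistance up to the next standard value in the series."""
    decade = 0.1
    while True:
        for v in series:
            if v * decade >= ohms:
                return v * decade
        decade *= 10

def dropping_resistor(supply_v, led_v, led_ma):
    """Ohms needed to drop (supply_v - led_v) at led_ma milliamps."""
    return (supply_v - led_v) / (led_ma / 1000.0)

r = dropping_resistor(5.0, 3.4, 12)          # about 133.3 ohms
print(round(r, 1), next_standard_value(r))   # 133.3 150.0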
If you want to operate the 3.4 volt LED from a 5 volt power supply at the LED's "typical" current of 20 mA, then 1.6 divided by .02 yields a resistor value of 80 ohms. The next highest standard value is 82 ohms, but 100 ohms will be more common.
If you want to run a typical 3.4 volt LED from a 5 volt supply at its maximum rated current of 30 mA, then divide 1.6 by .03. This indicates 53 ohms. The next higher popular standard resistor value is 56 ohms. Please beware that the 30 mA rating for 3.4-3.5 volt LEDs may be optimistic.
One more thing to do is check the resistor wattage. Multiply the dropped voltage by the LED current to get the wattage dissipated in the resistor. Example: 1.6 volts times .03 amp (30 milliamps) is .048 watt. For good reliability, it is recommended not to exceed 60 percent of the wattage rating of the resistor. A 1/4 watt resistor can easily handle .048 watt. If you need a more powerful resistor, 1/2 watt resistors are widely available in the popular values.
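And the wattage check as a quick Python sketch, using the same numbers; the 60 percent derating is just the rule of thumb above.

def resistor_watts(dropped_v, led_ma):
    """Power dissipated in the dropping resistor, in watts."""
    return dropped_v * (led_ma / 1000.0)

w = resistor_watts(1.6, 30)           # .048 watt
print(round(w, 3), w <= 0.6 * 0.25)   # 0.048 True  (fine for a 1/4 watt resistor)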
(Adapted from LEDs 101 at http://misty.com/people/don/ledd.html)
--Chris