I just discovered this thread and thought I'd give my 2 methods for finding out what value resistor I need for an unknown voltage LED.
I hope I don't make any stupid math mistakes, since I just woke up, but the method is right even if my numbers come out wrong. I'm doing it as I go.
Method 1:
Since I have a lab bench power supply with adjustable voltage outputs, that's usually my first method. Let it be known that you ONLY need a series protection resistor to limit current through the LED if you are going to supply the LED with more voltage than it is rated for. IF you supply the LED with exactly the voltage it's rated for, it automatically draws the current it can handle. As you increase the voltage beyond that, the current rises too, and eventually the LED is destroyed.
So if you supply less than the rated voltage, nothing happens: there isn't enough voltage to forward-bias the diode, and it's just an open circuit. I start the power supply at 0 volts and slowly increase it until the LED starts to turn on. I adjust it until it seems safe, not too dim, not too bright, and then whatever voltage I'm giving it is what I'm going to consider its rated voltage. Say it's 1.8 volts. If I'm going to power it from a 12 V computer power supply, then the resistor I add in series should drop 12 V - 1.8 V = 10.2 V. Then I measure the current actually going through the LED at my experimentally determined voltage: I stick a multimeter, set to measure current, in series between the LED and one of the power terminals. Say it's 10 mA. The resistor needed is 10.2 V / 10 mA = 1020 ohms, so I'd stick a 1K resistor in series with the LED on a 12 V supply.
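The arithmetic above is just Ohm's law on the leftover voltage, and it's easy to sketch in a few lines of Python (the function name and the 1.8 V / 10 mA numbers are only the example values from this post, not anything standard):

```python
def series_resistor(supply_v, led_v, led_current_a):
    """Ohm's law on the voltage left over after the LED's drop."""
    return (supply_v - led_v) / led_current_a

# Example from above: 1.8 V LED drawing 10 mA, run from a 12 V supply
r = series_resistor(12.0, 1.8, 0.010)
print(r)  # about 1020 ohms, so use the nearest standard value, 1K
```

Plugging in any other supply voltage works the same way; only the voltage across the resistor changes.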
I'd probably want to double-check the power rating of the resistor as I start using higher voltages so I don't burn it up. Most common resistors are 1/4 watt. Power = current squared x resistance: 0.010 amps squared times 1000 ohms = 0.1 watts. 1/4 watt is 0.25 watts, so my resistor is rated for more than twice what it will dissipate. No problem.
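That power check can be sketched the same way (the half-rating margin in the comparison is my own habit for leaving headroom, not something the post requires):

```python
def resistor_power_w(current_a, resistance_ohm):
    """P = I^2 * R, the power the series resistor must dissipate."""
    return current_a ** 2 * resistance_ohm

p = resistor_power_w(0.010, 1000)  # the 10 mA / 1K example: 0.1 W
print(p < 0.25)      # True: within a 1/4 watt rating
print(p <= 0.25 / 2) # True: also within half the rating, a comfortable margin
```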
If the calculated power is greater than my resistor's rating, I just split the resistance across several resistors in series: for example, make up 1K out of two 470 ohm resistors in series to get close to 1K, with twice the power rating, 1/2 watt total. Add another 56 ohms if needed to get closer to 1K using standard component values (996 ohms).
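The series-split trick works because resistances add while each resistor only dissipates its own share of the power. A quick sketch with the example combination from the post:

```python
# Resistances in series add; the same 10 mA flows through each part,
# so each resistor dissipates I^2 * R_part rather than I^2 * R_total.
parts_ohm = [470, 470, 56]                        # the combination from the post
total_ohm = sum(parts_ohm)                        # 996, close enough to 1K
watts_each = [0.010 ** 2 * r for r in parts_ohm]  # per-resistor dissipation at 10 mA
print(total_ohm, watts_each)
```

Each 470 ohm part only sees about 0.047 W here, far below a 1/4 watt rating.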
When you add a resistor and a higher-than-rated voltage to the LED, the voltage measured across the LED stays at its rated voltage, and the rest of the supply voltage shows up across the resistor if you happen to measure it. So running a 1.8 V LED on 24 volts would put 22.2 V across the resistor and still 1.8 V across the LED. 22.2 V / 10 mA = 2220 ohms, so about a 2.2K resistor in series with the LED. That's about 0.22 watts, just under a single 1/4 watt resistor's rating, so it works but with very little margin; splitting it across two resistors as above would be safer.
So that's method 1: set a variable power supply to 0 V, slowly turn up the voltage until the LED is satisfactorily lit, measure the current through the LED, calculate the resistor needed as (expected supply voltage minus LED voltage) divided by measured LED current, and make sure the resistor's power rating can handle the power it will dissipate at that voltage and current.
Method 2:
If you have the fixed-voltage power supply you're going to use, say 12 V, and the LED, and no idea what resistor to add in series, get a variable resistor of a reasonable value. From the example above, even at 24 volts you'd only need a few K, so don't use a pot rated at 100K or 50K, or maybe even 25K: the "few K" region of such a pot is a very short stretch of travel near the end, and you'll zoom through it too fast and maybe blow the LED. A 5K pot should be fine to work with. Ignore one of the end terminals on the pot and use the other two as your two resistor wires. When you adjust the pot, you'll vary the resistance between those two leads from 0 to full-range ohms.
Set the pot dial in the middle; that gives you half the resistance across the two terminals without needing to worry about which end of travel is full ohms and which is zero, and accidentally blowing the LED in ignorance. So you have the power supply, the LED, and the pot's two leads all connected in series. Power it up. Most likely the LED won't be on. Slowly adjust the pot in either direction until it comes on; if it doesn't, you're probably turning the pot the wrong way, increasing resistance and lowering current. Turn it slowly the other way until it comes on. If it still doesn't, maybe the LED is in backwards. Reverse it, set the pot back to center, and start over.
When you get the LED to light as desired, disconnect the pot and measure the resistance between the two terminals you used. That's the value you need for a fixed resistor.
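Whatever the meter reads, you'll still want to round it to a resistor you can actually buy. A small sketch that snaps a measured resistance to the nearest value in the common E12 (10% tolerance) series; the snap-to function is my own illustration, not part of the post's method:

```python
# E12 preferred values, one decade; real resistors repeat these per power of ten.
E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def nearest_standard(ohms):
    """Snap a measured resistance to the closest E12 standard value."""
    candidates = [m * 10 ** d for d in range(7) for m in E12]
    return min(candidates, key=lambda c: abs(c - ohms))

print(nearest_standard(1020))  # -> 1000.0, i.e. the 1K from method 1
```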
If there are any typos, calculation errors, or logic errors in all that, it's because I just woke up...at least gain some ideas for approaches from it.