Depends on the LEDs. If they are 5V LEDs, use 5V; if 12V, use 12V. Often they come with resistors wired in line to suit a certain voltage.
There really isn't a 5V LED or a 12V LED per se. LEDs have a forward voltage, which is used along with the desired current to calculate a resistor value. The same LED can be used in either a 5V or a 12V circuit simply by adjusting the resistor value. To use the formula from GGG, it's
Resistor Value = (Available Voltage - LED Voltage)/LED Current
So a single 3.3Vf 20mA LED on 5V works out to 85 ohms, which you'd round up to the next standard value, 100 ohms. The same configuration on 12V works out to 435 ohms, so something on the order of 470 ohms.
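That formula is easy to sanity-check in a few lines of code. This is just a sketch of the arithmetic above (the function name and the rounding to the nearest standard value are my own additions, not from GGG):

```python
def led_resistor(supply_v, led_vf, led_ma):
    """Resistor Value = (Available Voltage - LED Voltage) / LED Current.

    Returns the ideal resistance in ohms; in practice you round UP
    to the next standard (E12/E24) resistor value.
    """
    return round((supply_v - led_vf) / (led_ma / 1000.0))

# Single 3.3Vf, 20mA LED:
print(led_resistor(5, 3.3, 20))   # 85 ohms ideal -> use 100 ohm standard
print(led_resistor(12, 3.3, 20))  # 435 ohms ideal -> use 470 ohm standard
```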
Two of those LEDs in series on 5V wouldn't light, since their combined forward voltage (6.6V) exceeds the supply, even if you dropped the resistor entirely. They would work fine wired in parallel, each with its own resistor. So on and so forth.
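The series case boils down to one comparison: the forward voltages add up, and the total has to stay below the supply so the resistor has something left to drop. A quick sketch (my own helper, just illustrating the point above):

```python
def series_feasible(supply_v, led_vf, count):
    # Series LEDs add their forward voltages; the supply must exceed
    # the total, leaving headroom for the current-limiting resistor.
    return supply_v > led_vf * count

print(series_feasible(5, 3.3, 2))   # two 3.3Vf LEDs on 5V: False
print(series_feasible(12, 3.3, 2))  # same pair on 12V: True
```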
I'm sure someone who actually went to school on this will chime in soon enough, but this site will explain the basics:
http://www.kpsec.freeuk.com/components/led.htm

@ D_Harris
Unless I've purchased some LEDs with a Vf that exceeds my source, the LEDs rarely determine what voltage I run from. It's usually dictated by the number of LEDs I want to drive, or by restrictions I'm stuck with, e.g. I'm sourcing power strictly from USB and have no other means of driving the LEDs. More commonly, it's just a matter of what parts I have available.