I will try to sort this out. If it gets too technical, please let me know and I'll take a second pass at it.

First of all, what matters when it comes to LEDs is the current.
As you can see in the specifications (IF / Continuous Forward Current), it is rated at 35 mA. By the way, 35 mA = 0.035 A.
The other critical value here is the voltage (VF / Forward Voltage), which is 3.3 V.
I assume you have an ordinary PSU with 12 V, and since you talked about multiple LEDs, I think it is a better idea to go with that rather than the 5 V.
We can do a 5 V example later on as well. The general rule is to keep the current as low as possible, and for LEDs that means connecting them in series, not in parallel.
The current will always be 35 mA no matter how many LEDs you put there. The only thing that changes is the voltage.
If we have 3 LEDs in series, each with a voltage drop of 3.3 V, then 2.1 V is left over the resistor (3.3 + 3.3 + 3.3 = 9.9, and 12 - 9.9 = 2.1).
Now we come to Ohm's law, which says that U = I * R (U = voltage, I = current and R = resistance).
Since we know U and I here, we can easily determine what resistor we need: R = U / I = 2.1 / 0.035 = 60 ohm.
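If you'd rather let the computer do the arithmetic, here is a minimal Python sketch of that calculation. The variable names are my own, and the numbers are just the ones from this example, so swap in your own supply voltage, LED data and LED count:

```python
# Resistor value for LEDs in series - the numbers from the example above
supply_voltage = 12.0      # V, the PSU voltage
led_forward_voltage = 3.3  # V, VF from the datasheet
led_current = 0.035        # A, IF = 35 mA
num_leds = 3               # LEDs connected in series

# Voltage left over the resistor = supply minus the sum of the LED drops
resistor_voltage = supply_voltage - num_leds * led_forward_voltage  # 12 - 9.9 = 2.1 V

# Ohm's law: R = U / I
resistance = resistor_voltage / led_current  # 2.1 / 0.035 = 60 ohm

print(f"Voltage over resistor: {resistor_voltage:.1f} V")
print(f"Required resistance:   {resistance:.0f} ohm")
```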
Resistors are not made in every value, so the closest value at or above that in the standard series (E12) is 68 ohm. That is close enough, and rounding up just means slightly less current, which is safe.
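If you want to pick the standard value automatically, something like the sketch below works. Note that I have only typed in one decade of the E12 series by hand and scaled it up; it is an illustration, not a complete preferred-value table:

```python
# Pick the next E12 value at or above the calculated resistance,
# so the actual current ends up at or below the target
E12_DECADE = [10, 12, 15, 18, 22, 27, 33, 39, 47, 56, 68, 82]
e12_values = [base * mult for mult in (1, 10, 100, 1000) for base in E12_DECADE]

calculated = 60  # ohm, from the Ohm's law step above
chosen = min(v for v in e12_values if v >= calculated)
print(f"Nearest E12 value: {chosen} ohm")  # -> 68 ohm
```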
For resistors you also need to know the power dissipated, or else you can burn the resistor out. The power, P, is the voltage multiplied by the current.
P = U * I = 2.1 * 0.035 = 0.0735 W. That means you can safely use a standard 1/4 W resistor (1/4 W = 0.25 W).
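And the power check as a sketch, again with my own variable names. It uses the worst-case 35 mA figure; with the 68 ohm resistor the real current is a bit lower, so the actual dissipation is even smaller:

```python
# Power dissipated in the resistor: P = U * I
resistor_voltage = 2.1   # V, from the series calculation above
led_current = 0.035      # A, IF = 35 mA
power = resistor_voltage * led_current  # 2.1 * 0.035 = 0.0735 W

rating = 0.25  # W, a standard 1/4 W resistor
print(f"Dissipated: {power:.4f} W, rating: {rating} W -> "
      f"{'OK' if power < rating else 'use a bigger resistor'}")
```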