What do I have with these 2 LEDs and 3 resistors?

So I was playing around with some LEDs I had lying around. I'd like to know if this setup is efficient or if I should change something.

I have two red LEDs (1.5 V, 20 mA) with one 200 Ω 1% resistor and two 120 Ω 1% resistors, all connected in series…

I know I only need 300 Ω for the two, but I didn't have any smaller resistors to use. So with this setup, what is wrong (if anything), and what can I do to correct it so it runs and works correctly for a long time?
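To make the numbers above concrete, here is a small sketch of the arithmetic behind the 300 Ω figure and what the actual 440 Ω series string does to the current (the 9 V supply is mentioned later in the thread; component values are taken from the post):

```python
# Series LED current-limit check.
# Values from the post: two 1.5 V / 20 mA red LEDs, 200 + 120 + 120 ohm
# resistors, all in series on an assumed 9 V supply.
supply_v = 9.0
led_drop_v = 1.5 * 2            # two red LEDs in series drop ~3 V total
target_a = 0.020                # desired LED current, 20 mA

ideal_r = (supply_v - led_drop_v) / target_a       # resistance for 20 mA
actual_r = 200 + 120 + 120                         # what the post uses
actual_ma = (supply_v - led_drop_v) / actual_r * 1000

print(f"ideal resistance: {ideal_r:.0f} ohms")     # 300 ohms
print(f"actual resistance: {actual_r} ohms")       # 440 ohms
print(f"actual current: {actual_ma:.1f} mA")       # ~13.6 mA
```

So the extra resistance isn't harmful; it just runs the LEDs at roughly 13.6 mA instead of 20 mA, making them a bit dimmer but, if anything, longer-lived.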

Thank you.

What voltage is used in the circuit?

Currently that setup is running off a 9 V battery, if that is what you were asking…

cubangt:
So I was playing around with some LEDs I had lying around. I'd like to know if this setup is efficient or if I should change something.

Your setup, assuming this ties into your other thread, is the usual case. No, it's not the most efficient in terms of LED output versus power consumed. Let's say you really got 9 V from the battery and each LED dropped 1.5 V. Then you'd have 3 V across both LEDs and 6 V across the current-limiting resistor(s). So one-third of the power goes to making light, and two-thirds goes to making heat in the resistor(s).
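The one-third/two-thirds split follows directly from the voltage drops, since the same current flows through everything in a series circuit. A quick check, using the idealized numbers above (9 V supply, 3 V total LED drop, 300 Ω of limiting resistance):

```python
# Power split in a series LED circuit: P = V * I for each element,
# with the same current I through the whole string.
supply_v = 9.0
led_v = 3.0                     # combined drop of the two LEDs
r_ohms = 300.0                  # ideal limiting resistance for 20 mA

i = (supply_v - led_v) / r_ohms          # 0.020 A through everything
p_leds = led_v * i                        # power dissipated in the LEDs
p_resistor = (supply_v - led_v) * i       # power wasted in the resistor

print(f"LED power:      {p_leds * 1000:.0f} mW")      # 60 mW
print(f"Resistor power: {p_resistor * 1000:.0f} mW")  # 120 mW
print(f"LED fraction:   {p_leds / (p_leds + p_resistor):.2f}")  # 0.33
```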

Using a battery voltage closer to the sum of the LED voltages would help, as would using a dedicated current driver for LEDs. But the latter is probably overkill for your usage, and the former can cause practical problems when the voltage difference between the supply and the LEDs is “small”.

Look at how the battery voltage varies with the current drawn from it, and how it falls over the battery's lifetime.

http://img.weiku.com/waterpicture/2011/ … 9046_1.jpg

http://1.bp.blogspot.com/_OGu0H7TFUno/T … Bcurve.jpg

How would you pick a single resistance value that's optimal under all those conditions? You can't. There's always some inefficiency.
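That variability is easy to see numerically. With the fixed 440 Ω string from the post, the LED current tracks the battery voltage as it sags over discharge (the voltage points below are illustrative, not taken from the linked curves):

```python
# LED current vs. battery voltage with a fixed series resistor.
# 440 ohms and a 3 V combined LED drop come from the post; the sample
# battery voltages are illustrative discharge points, not measured data.
led_v = 3.0
r_ohms = 440.0

for batt_v in (9.6, 9.0, 8.4, 7.8, 7.2):
    ma = (batt_v - led_v) / r_ohms * 1000
    print(f"{batt_v:.1f} V -> {ma:.1f} mA")
```

The current (and thus brightness) drifts from about 15 mA on a fresh battery down toward 9.5 mA as it runs down, which is exactly why no single resistor value is optimal for the whole discharge curve.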