Beginner question for tutorial #1

Sorry if this is really basic. I’ve done the first few tutorials already, but I still have a simple electronics question about #1 that I can’t find an easy-to-understand explanation for. If my voltage regulator outputs 3.3V and the power status LED I have is rated at 3.3V, why do I still need a resistor for the LED? When I ran a test on my breadboard with no resistor, my 9V battery overheated and died within minutes, but with a 220R resistor it did much better.

Can someone give an easy-to-understand explanation? I want to keep the LED at its recommended voltage, but I don’t want it to interfere with my power supply either. Thanks!

http://www.sparkfun.com/commerce/tutori … ials_id=57

An LED should have a forward voltage and current rating in its data sheet. What are the values for the one you are using?

Leon

Thanks, it’s 3.3V forward voltage and 20mA forward current.

3.3V doesn’t sound right. Do you have a link to the data sheet?

Leon

Never drive an LED directly from a voltage source. An LED is a current-driven device! The voltage rating of an LED is its forward voltage drop; it is basically a diode. You must use a series resistor or an LED driver IC to limit the current through the LED. The only exception is that some LEDs have built-in limiting resistors.

I don’t have the sheet handy, but I know the specs. I was just asking for a good explanation of why, if my supply were regulated to exactly the forward voltage of the LED, say 3.3V, I would still need a resistor. If I do need one, does that mean my LED will have to be dim? Sorry if I’m not phrasing it correctly.

LEDs don’t have a forward voltage of 3.3V; they are usually about 1.8V, and they need a series resistor. Use Ohm’s Law to calculate the value.
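
For example, here is the calculation as a quick Python sketch. The numbers are just assumptions for illustration: a 3.3V supply, a red LED with a 1.8V drop, and a 20mA target current.

v_supply = 3.3   # regulator output (V)
v_f = 1.8        # LED forward voltage drop (V), assumed for a red LED
i_f = 0.020      # target LED current (A)

r = (v_supply - v_f) / i_f    # Ohm's law on the resistor's share of the voltage
p = (v_supply - v_f) * i_f    # power dissipated in the resistor

print(f"R = {r:.0f} ohms")         # 75 ohms; round up to the next standard value (82 ohms)
print(f"P = {p * 1000:.0f} mW")    # 30 mW, fine for an ordinary 1/4 W resistor

The resistor takes up whatever voltage the LED doesn’t, and its value sets the current.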

Leon

Blue LEDs can have a forward voltage of around 3.3V or higher. I have found that they will still work well with a small-value resistor (50 to 100 ohms).

The problem is that the current through the LED varies very steeply with the voltage across it: at 3.3V it might be 10mA, but at 3.2V it could be only 1mA, and at 3.4V, 100mA.

Each individual LED will have a certain tolerance on its forward voltage, so although the nominal value may be 3.3V, the actual value could vary by 5%. The voltage regulator you are using will have some tolerance on the output too. Furthermore, factors such as temperature cause the forward voltage and the supply voltage to drift over time. Combined, these variations lead to an unpredictable (and potentially damaging) current in the LED.

Using a series resistor solves the problem by making the overall combination of LED and resistor have a much flatter I-V characteristic, so that small changes in voltage result in only small changes in current.
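
To put rough numbers on that, here is a small Python sketch. The exponential LED model and the component values are assumptions chosen only to illustrate the shape of the behaviour, not data from a real part.

import math

# Exponential LED model, calibrated so I = 20 mA at V_f = 3.3 V.
# n_vt = 50 mV is an assumed (but plausible) ideality/thermal factor.
def led_current(v, i_ref=0.020, v_ref=3.3, n_vt=0.050):
    return i_ref * math.exp((v - v_ref) / n_vt)

# Bare LED driven directly from a "3.3 V" rail that is off by +/- 0.1 V:
for v in (3.2, 3.3, 3.4):
    print(f"direct drive, {v:.1f} V: {led_current(v) * 1000:7.1f} mA")

# The same +/- 0.1 V shift in forward voltage, but with a 75 ohm resistor
# from a 4.8 V supply (so the nominal current is still about 20 mA):
v_supply, r = 4.8, 75.0
for v_f in (3.2, 3.3, 3.4):
    i = (v_supply - v_f) / r
    print(f"with resistor, V_f = {v_f:.1f} V: {i * 1000:5.1f} mA")

With the bare LED, a 0.1V shift swings the current by roughly an order of magnitude in each direction; with the resistor in series, the same shift moves it by only about 7%.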

Thanks, I will try using a low-value resistor.