Utterly confused about LED drive voltage.

A 7 segment display I’m using has a typical forward voltage of 2v and a maximum of 2.5v with a 30mA forward current. I’m using a 5v supply. Everything that I have read says to take the difference between the supply and forward voltage and divide by the forward current to get the value for the current limiting resistor. The voltage out of the resistor is 5v. Now, I do understand that the resistor is limiting the current, but I am still driving the LED at 2.5v above its maximum voltage rating. How is this OK?

Clearly, there is something that I am not understanding.

The voltage out of the resistor is 5v.

This is the misconception. The 5 volts goes into the resistor (if the resistor is on top of the LED); what comes out the other end and goes into the LED (if it is at the bottom) is less than that. The drop across the resistor is current times resistance (Ohm's law). A properly sized resistor is (5 − 2) / 0.03 = 100 ohm. So about 3 volts is dropped across the resistor, and the remaining 2 volts appears as the forward voltage across the LED element of the display.
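A quick sketch of that arithmetic, using only the numbers quoted in this thread (5 V supply, 2 V typical forward voltage, 30 mA target current):

```cpp
#include <cstdio>

int main() {
    // Values quoted above (from the display's datasheet)
    const double supply = 5.0;   // supply voltage in volts
    const double vf     = 2.0;   // typical LED forward voltage in volts
    const double i_f    = 0.030; // desired forward current in amps (30 mA)

    // Ohm's law: the resistor has to drop whatever the LED doesn't
    double r          = (supply - vf) / i_f;  // (5 - 2) / 0.03 = 100 ohm
    double v_resistor = supply - vf;          // ~3 V across the resistor

    printf("Resistor: %.0f ohm, drop across it: %.1f V, across the LED: %.1f V\n",
           r, v_resistor, vf);
    return 0;
}
```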

If you use a lower resistance, more current passes through the LED, its forward voltage is pushed higher, and it burns brighter. Around 2.5 V it starts to exceed its maximum rating and break down. You never want to size the resistor for that maximum forward voltage; a broken LED is of no use.

In a basic LED-resistor circuit it doesn't matter whether the resistor or the LED is on top. But a 7-segment display is special in that it has multiple LEDs in parallel with a common pin, which may be either the anodes or the cathodes. So the position of the resistor determines whether it limits the current for just one individual LED segment, or for all of them at once when it is connected to the common pin.
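Here is a rough sketch of why a single resistor on the common pin is a problem, assuming one 100 ohm resistor, a 2 V forward voltage, and an idealized even current split (real segments won't share perfectly evenly):

```cpp
#include <cstdio>

int main() {
    const double supply = 5.0;    // volts
    const double vf     = 2.0;    // assumed forward voltage per segment
    const double r      = 100.0;  // single resistor on the common pin

    // With one resistor on the common pin, the ~30 mA total current is shared
    // by however many segments are lit, so brightness changes with the digit.
    for (int lit = 1; lit <= 7; ++lit) {
        double total      = (supply - vf) / r;  // total current through the resistor
        double perSegment = total / lit;        // idealized even split
        printf("%d segment(s) lit: %.1f mA each\n", lit, perSegment * 1000.0);
    }
    return 0;
}
```

That is why a resistor per segment pin is the usual arrangement: each segment then gets the same current no matter how many are lit.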

So, why do I read 5v across the resistor with a voltmeter?

You must have something wrong with the circuit.

5 volts into the resistor and testing on the other side. That’s it. 5v source->resistor->voltmeter. I was curious, so I did this to be sure. I’m getting 5 volts out.

MarkS:
… 5v source->resistor->voltmeter.

There is no mention of an LED in there. So show us a picture of your circuit; it is unclear what you are doing now with the 7-segment display LED. A datasheet of the 7-segment display that you use would also help. Depending on which side of the LED has the higher voltage, no current may flow at all, and then [EDIT] the voltage between the resistor and ground might be the full 5 volts.

A voltmeter measures the voltage difference between two points. It should be connected in parallel with the part you want to measure the voltage across. Often you connect the black/common/minus test lead to the minus or ground of the power source, so that you measure all voltages against a common reference point. The setup you described suggests the voltmeter is wired in series with the resistor (and nothing else, no LED in the circuit). That arrangement is only useful if you want to measure current.

I suggest you take a look at Sparkfun’s tutorial site, and read the multimeter tutorial.

http://en.wikipedia.org/wiki/Ohm%27s_law

(click the link to open it)

Thank you! I knew I was missing something. I just tested it and I now get 1.9 volts.

This is what I was doing (see pic). I removed the LED from the circuit and replaced it with a voltmeter to check for the voltage drop. This still leaves me a little confused. There has to be current flowing through the meter for it to read the voltage. Why no drop?

MarkS:
There has to be current flowing through the meter for it to read the voltage. Why no drop?

There is current flowing into the meter, but due to the meter's high input resistance it is very, very small. You'd need a precision DVM to read the resulting drop. Same thing as Case 2 above.
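To see why the meter still reads essentially 5 V, here is a quick sketch of that series circuit. The 10 Mohm meter input resistance is just an assumed typical value for a DMM, not something from this thread:

```cpp
#include <cstdio>

int main() {
    const double supply  = 5.0;    // volts
    const double r_limit = 100.0;  // the current limiting resistor from above
    const double r_meter = 10e6;   // assumed 10 Mohm meter input resistance

    // The resistor and the meter form a voltage divider; almost all of the
    // 5 V appears across the meter because its resistance dominates.
    double i          = supply / (r_limit + r_meter);  // ~0.5 microamps
    double v_resistor = i * r_limit;                   // ~50 microvolts on the 100 ohm
    double v_meter    = i * r_meter;                   // ~5 V shown on the meter

    printf("Current: %.2f uA, drop on resistor: %.0f uV, meter reads: %.4f V\n",
           i * 1e6, v_resistor * 1e6, v_meter);
    return 0;
}
```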

Again, thank you!