I’m trying to get my head around using a low-side driver with a large string of LEDs. And failing.
Let’s say there are 20 LEDs in series, each with Vf = 2.5V, so the string needs at least 50V to light. Assume a 55V PSU and a series resistor to limit current - no clever constant-current drivers.
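For concreteness, here’s the quick arithmetic I’m working from (the 20mA target current is just an assumption for illustration, not a requirement):

```python
# Sanity check of the numbers above. The 20 mA target current is
# an assumed value for illustration, not a fixed requirement.
n_leds = 20
vf = 2.5            # forward voltage per LED, volts
v_psu = 55.0        # supply voltage, volts
i_led = 0.020       # assumed string current, amps

v_string = n_leds * vf           # 50 V across the LED string
v_resistor = v_psu - v_string    # 5 V left for the series resistor
r_series = v_resistor / i_led    # 250 ohms
p_resistor = v_resistor * i_led  # 0.1 W in the resistor

print(f"String drop:    {v_string:.1f} V")
print(f"Resistor drop:  {v_resistor:.1f} V")
print(f"Series R:       {r_series:.0f} ohm")
print(f"Resistor power: {p_resistor:.2f} W")
```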
I would have thought that a low-side switch needs to be able to cope with no more than 5V across it. Is that right?
My reasoning is that the LED string will always drop 50V, so no more than 5V can appear across the switch. If the load were resistive (e.g. an indicator lamp), I can see that the switch would need to withstand the full 55V: the open switch has a very high resistance while the lamp’s is negligible, so by Ohm’s law nearly all of the applied voltage appears across the switch. But with LEDs, surely the string’s drop is pinned at the sum of the Vf values, leaving the switch to see only the remainder?
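To put numbers on the resistive comparison, here’s a quick voltage-divider sketch (the 300 ohm lamp and 1 Mohm off-state switch resistance are made-up values, just to show the effect):

```python
# Voltage divider for the resistive-load case. Both resistances
# are assumed, illustrative values.
v_psu = 55.0
r_lamp = 300.0      # assumed lamp resistance, ohms
r_switch = 1e6      # assumed off-state switch resistance, ohms

v_switch = v_psu * r_switch / (r_lamp + r_switch)
print(f"Across the open switch: {v_switch:.2f} V")  # ~54.98 V
```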
The background to this question is that I need to PWM-drive 120 series LED strings at 50V per string. Most of the high-channel-count constant-current PWM drivers (e.g. the TLC5940) quote fairly low maximum voltages (e.g. 17V) on the LED outputs. I’m trying to decide whether they mean the supply feeding the LED string can be no more than 17V (and if so, why, assuming my thinking above is correct), or whether the LED supply just has to be no more than 17V greater than the forward voltage of the string.
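Here are the two readings spelled out numerically (this only restates my two interpretations; the datasheet is the real authority):

```python
# Two possible readings of a "17 V max on the LED outputs" spec,
# applied to my 50 V string. Illustrative only; the datasheet is
# the real authority.
v_out_max = 17.0   # quoted maximum output-pin voltage
v_string = 50.0    # forward voltage of the 20-LED string
v_psu = 55.0       # my supply

# Reading 1: the LED supply itself may not exceed 17 V.
reading_1_ok = v_psu <= v_out_max

# Reading 2: the supply may exceed 17 V as long as it is no more
# than 17 V above the string's forward voltage.
headroom = v_psu - v_string
reading_2_ok = headroom <= v_out_max

print(f"Reading 1, supply {v_psu:.0f} V vs 17 V limit:   {'OK' if reading_1_ok else 'not OK'}")
print(f"Reading 2, headroom {headroom:.0f} V vs 17 V limit: {'OK' if reading_2_ok else 'not OK'}")
```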
If anyone can confirm or deny my thinking, I’d appreciate the insight!
Thanks,
Tom