curiouslystrong:
A voltage divider is probably the cheapest, but I was reading that different load draws can make it not work the way it's supposed to.
Why can’t I just use one resistor like the way it’s done for LEDs?
The answer to "why" for both of the questions above is the same. A resistor with current flowing through it will have a voltage across it per [Ohm's law](http://en.wikipedia.org/wiki/Ohm%27s_law). That is:
Vr = Ir × R, where Vr is the voltage across the resistor, Ir is the current through it, and R is the resistance in ohms.
So if you start off with 5 V and flow some current through the resistor, the voltage on the “other side” of the resistor is less, per the above. But if the current varies, the “output” voltage will also vary, and that’s not what you want when powering a device. In the case of an LED or an unloaded voltage divider, the current stays the same once you’ve set the resistor value(s).
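To make that concrete, here's a minimal Python sketch (with assumed example values) showing how a load in parallel with the bottom resistor pulls a divider's output down:

```python
# A load resistance RL appears in parallel with R2 and lowers the output.
def divider_out(vin, r1, r2, rl=None):
    """Divider output voltage; rl=None means unloaded."""
    r2_eff = r2 if rl is None else (r2 * rl) / (r2 + rl)
    return vin * r2_eff / (r1 + r2_eff)

vin, r1, r2 = 5.0, 1000.0, 1000.0              # 5 V in, two 1k resistors
print(divider_out(vin, r1, r2))                # 2.5 V unloaded
print(divider_out(vin, r1, r2, rl=1000.0))     # ~1.67 V with a 1k load
```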
FWIW you can use a Zener diode and a resistor as a simple voltage regulator, but that technique only works over a limited range of load current, and I'm not sure it's all that much cheaper than a simple 78xx regulator.
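If you do go the Zener route, a rough sizing sketch (assumed example values throughout) looks like this:

```python
# Series resistor for a Zener shunt regulator: it must pass the worst-case
# load current plus enough bias current to keep the Zener in regulation.
vin, vz = 9.0, 5.1        # supply and Zener voltages (assumed)
i_load_max = 0.020        # worst-case load current, 20 mA (assumed)
i_z_min = 0.005           # minimum Zener current for regulation (assumed)

r = (vin - vz) / (i_load_max + i_z_min)
p_zener_max = vz * (i_load_max + i_z_min)   # worst case: load disconnected
print(f"R ≈ {r:.0f} Ω, Zener must handle {p_zener_max * 1000:.0f} mW")
```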
jremington, thank you for the suggestion. I was just hoping for a cheaper solution (<$1), but this would be the safest. I wanted to limit the amount of components I had on my board.
Mee_n_Mac, that was a great explanation, and it gave me a better understanding of how to use the three. The current draw from the load will affect the voltage, making the output unreliable.
So, in cases where the current draw varies, a voltage regulator is essential. In cases where current draw wouldn’t vary, I could use a voltage divider or a resistor?
Just to confirm what I understood…
Let’s say I’m trying to use this solid state relay. There doesn’t seem to be a max voltage rating on it, but there is a 1 mA max for the control signal. In this case, I can treat it the same way as in the LED circuit. From my 5 V supply, can I use a 5k resistor on the control line and expect the current to be limited to what it should be?
In fact the SSR above is an opto-coupled device, so its control input is an LED with a Vf of 1.2 V typ, 1.4 V max (@ 5 mA). The recommended input current is 3 mA min (at high temps); no max is given (the graphs hint at 50 mA) but I’d use 20 mA. So treat it just like driving an LED, with a resistor that allows at least 3 mA but no more than 20 mA. Given the turn-on time graph, I’d aim for 10-15 mA if drive current isn’t limited.
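As a quick sanity check, here's a sketch of the resistor calculation using the figures quoted above:

```python
# Series resistor for the SSR's input LED, using the numbers quoted above.
vcc = 5.0         # supply voltage
vf = 1.2          # typical LED forward voltage
i_target = 0.012  # aim for ~12 mA, inside the 3-20 mA window above

r = (vcc - vf) / i_target
print(f"R ≈ {r:.0f} Ω -> nearest standard value is 330 Ω")
print(f"With 330 Ω: I = {(vcc - vf) / 330 * 1000:.1f} mA")
```

Note that a 5k resistor from 5 V would give only about (5 − 1.2) / 5000 ≈ 0.76 mA, well below the 3 mA minimum.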
I read somewhere that when using linear voltage regulators, the supply voltage has to be higher than the voltage you want to step down to. I want to step up my voltage after a diode (5 V → 4.3 V → 5 V; trying to avoid using a switching regulator).
However, the regulator’s datasheet lists an absolute input range of -0.3 to 8.5 V. Should I trust the datasheet?
You are missing the dropout voltage. The input must be at least the output plus the dropout voltage for the regulator to regulate; below that, the output will be less than the regulated value. The absolute maximum ratings are safety limits; outside that range the part may be damaged.
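A minimal sketch of that check, assuming a 7805-style regulator with a ~2 V dropout (substitute your own part's datasheet figure):

```python
# Dropout check: the input must exceed the output by the dropout voltage.
v_out = 5.0
v_dropout = 2.0     # typical for a 7805; LDOs can be far lower (assumed part)
v_available = 4.3   # 5 V supply minus the diode drop, from the post above

v_in_min = v_out + v_dropout
print(f"Need Vin >= {v_in_min} V; have {v_available} V -> can't regulate 5 V out")
```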
curiouslystrong:
I want to step up my voltage after a diode (5v → 4.3v → 5v; trying to avoid using a switching regulator).
As said above, with a linear regulator you always reduce the voltage, output vs. input. If you want to increase the voltage you need a boost converter, which is always some form of switching regulator.
But if I may ask… why would you want to boost the voltage? You have 5 V available; what stops you from using that? Perhaps you could give us the “big picture” as to what it is you’re trying to do. I suspect there may be an easier, more conventional way of accomplishing the desired end goal than what I’m getting from your questions.