I have the following problem and I wonder if somebody can give me a clue.
On my electronic board I have an AC/DC converter with the following specifications:
Max. input voltage = 277 V
Max. input current = 70-80 mA
Freq. = 60 Hz
However, my input voltage varies from 120 to 600 V.
What comes to mind is a self-adjusting circuit that detects voltages below 277 V and lets them pass through, while stepping down anything higher. For that purpose I believe I need a step-down transformer or a clamper circuit, but I have no idea whether this is feasible!
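One quick way to gauge the clamper idea: any series element that drops the excess voltage linearly turns it all into heat. A rough back-of-the-envelope sketch (assuming the converter really draws its stated 80 mA maximum, and ignoring AC waveform details):

```python
def dropper_dissipation(v_in, v_max, i_load):
    """Power burned in a series element that drops v_in down to v_max
    while passing i_load. Returns watts."""
    return max(v_in - v_max, 0.0) * i_load

# Dropping 600 V down to the converter's 277 V limit at 80 mA:
print(dropper_dissipation(600, 277, 0.080))  # 25.84 -> roughly 26 W of waste heat
```

Roughly 26 W of continuous dissipation in a small clamp element is a lot of heat for an 80 mA load, which is one reason a transformer (or a properly rated wide-range supply) is usually preferred over a dissipative dropper.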
On the other hand, it may be better to have two voltage converters. (As far as I have searched, there is no AC/DC converter on the market covering the whole 0-600 V range.) One would cover 90-290 V and the other 300-600 V. However, I would then need a device, maybe a relay, to detect whether the voltage is below 277 V or not.
In short, what do you think, and how would you guys solve such a problem?
Sounds like you are trying to design a power supply that works around the world from any input (90V is the low end for Japan, and 660V the high end in Canada). It’s actually quite tricky. What you are really trying to design is a 660V-input supply that works down to 90V. The component and trace spacings and part ratings must be spec’d for the high end. I just did a 347V-in supply, and needed 900V FETs.
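To see why the part ratings climb so fast at the high end, here is a quick worst-case peak calculation (assuming a ±10% line tolerance, which is a common design assumption):

```python
import math

def worst_case_peak(nominal_rms, tolerance=0.10):
    """Peak line voltage at the high end of the tolerance band,
    for a sinusoidal input of the given nominal RMS voltage."""
    return nominal_rms * (1 + tolerance) * math.sqrt(2)

# 600 V nominal service (Canadian high end):
print(round(worst_case_peak(600)))  # 933 V peak

# 347 V line-to-neutral:
print(round(worst_case_peak(347)))  # 540 V peak
```

A boost-PFC front end then regulates its bus above that peak, and switching transients eat further into the margin, which is how a 347V-in design ends up needing 900V FETs.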
Please note that there’s enough energy in the AC line to kill or vaporize you, your product, and your test equipment with the slightest mistake. Don’t take this the wrong way, but it doesn’t sound like you have the experience to safely design this.
I’d get a listed power supply that has already been safely designed, tested, and has passed UL/CSA/TUV or other NRTL testing. If you need to go with two versions of the device for different power ranges, I’d stock two separate versions or have some obvious wiring differences between the two. Remember that most input connectors, fuses, fuse holders, switches, and wire are NOT rated for 660V (often they max out at 250V…)