So I’ve got a project I’ve been trying to get off the breadboard for a while, but I haven’t been able to.
The project in its simplest form:
Arduino Uno R3
Anmbest Interfacing Voltage Sensor
Using these two items, I can read the voltage of a battery properly. The issue appears when I try to power the setup from an external power source (a battery pack).
Issue: Readings are jumping and inconsistent.
Example: 9V battery connected to the terminals, 10 readings:
9.69
9.69
9.68
9.69
9.68
9.68
9.69
9.68
9.69
9.68
Example of the same 9V battery, with the setup powered via the external battery pack:
9.72
9.67
9.70
9.70
9.68
9.65
9.70
9.75
9.68
9.65
My first concern was that the battery pack could not supply a properly regulated 5V reference. Using a multimeter I determined this is not the case; the incoming voltage never dropped below 5.08V.
I don’t recall the exact specifics of the pack in question, but it’s a 10000mAh battery pack, a beastly fellow.
Any help would be much appreciated.
P.S. I also tried using a buck converter (power regulator) between the battery pack and the Uno, but I found that its output voltage fluctuated GREATLY. On the IN side of the buck converter I was getting a steady 5.08V. On the OUT side I was getting readings like:
5.04
5.02
5.04
5.04
5.02
Etc., etc. Constantly changing, never steady. Would this affect my readings as well? Or am I good so long as it stays at a minimum of 5V?
Thanks!
Hey there, thanks for joining us on the forums!
I’ve still got a few questions about your setup. Can you share a link to the voltage sensor you are using and also share a schematic of your circuit?
Also, what is the power supply that you use when not using a battery pack?
There are a few things that I think could be affecting this:
- Poor grounding between components, since after all any voltage measurement is a differential measurement.
- Voltage drop through the R3’s voltage regulator, since the Uno’s ADC uses its 5V supply rail as the measurement reference.
I also don’t know what a reasonable expectation is here. I see the relatively stable output at 9.68–9.69V, and I don’t think an approx. 0.1V range is that bad either. What’s your application? How precise/accurate do you need to be?
A little more info would help us get to the root of this… Thanks!
Thanks for the reply!
Voltage Sensors:
https://www.amazon.com/Voltage-Measurem … 142&sr=8-5
I’ve never done a schematic before sadly, so not quite sure where to start.
My actual setup is a bit more complex than what I described, but I simplified it to keep the post short.
So this is what I started with:
That sensor, an Arduino Uno R3 and a serial cable.
This is the tutorial I followed
https://www.electronicshub.org/interfac … h-arduino/
And it worked fine, but I could not get the precision I required. I then found out about external ADC modules and landed on an ADS1115 16-bit ADC module.
This was able to give me the resolution I needed. I used this tutorial:
http://henrysbench.capnfatz.com/henrys- … -tutorial/
But I added my voltage sensor on the A0 & A1 connections. This works just fine while connected via the serial cable to a desktop PC, reading the output on the Serial Monitor.
I do not believe it is a grounding issue; I have meticulously resoldered all connections.
I need to be as scientifically accurate as I can be. I work in the industrial battery industry rebuilding batteries and stress testing them. So I have to know the voltage of each cell of the battery.
The issue isn’t the 0.01V changes; it actually goes a bit further than that. I can take that same setup, change nothing, power it on the next day, and my readings are off by ±0.05V.
Now I imagine that can be attributed to voltage decay in the battery pack, but so long as it hasn’t dropped below 5V, I don’t see how that plays a factor.