How to use full ADC resolution with voltage divider

Status
Not open for further replies.

Experimentalist

Well-known member
I want to use a voltage divider to read the voltage of a 5-cell LiPo, but I want to use the full resolution of the ADC with analogRead() across a limited voltage range. The maximum is 4.2V per cell, and although I do not want to discharge past 3.0V per cell, I would like to range it to read down to 2.7V per cell. If I use a voltage divider with a 5K6 tied to the battery positive (max = 5 x 4.2 = 21V, min = 5 x 2.7 = 13.5V) in series with a 1K to ground, I should get about 3.18V at the tap when the battery is fully charged (21V) and about 2.05V when the battery is at 13.5V.

Is there a mechanism I can use to offset this so that I can use the full ADC resolution between these two voltages? As I see it, using the Teensy LC with its 12-bit ADC I have 3.3V/4096 = 0.8mV resolution. The divider output spans 2.05V/3.3V x 4096 = 2544 to 3.18V/3.3V x 4096 = 3947 counts, so I am only using 3947 - 2544 = 1403 of the available 4096 codes. Can I somehow offset this so that I can use the full 12-bit resolution to read from 2.05V to 3.18V coming from my voltage divider?
 
To offset your voltage you can use an op-amp, where you would use a resistive divider on the non-inverting input to set your range midpoint, and the ratio of the input resistor and gain (feedback) resistor to adjust the gain to maximise the ADC range. While you can math out the various values, I would suggest either building a simulation or building out a prototype with variable bias and gain control resistors to get a handle on how this all works. In picking an op-amp, make sure you get one that can run from a supply as low as 5V and does not need a negative supply.
 

Thanks for your response. So, if I'm reading this correctly, I would use a reference voltage and a voltage divider to provide a fixed 2.05V (the divider output at the 13.5V cutoff) into the op-amp's non-inverting input. I would then connect the centre point of my voltage divider as described above to the inverting input, and the output would be the difference of the two. I would then select bias and gain control resistors to amplify the output to provide, say, 3.2V when the batteries were fully charged (I am guessing I would want to leave a little headroom to avoid any risk of going over 3.3V). Therefore, as I understand it, the op-amp would provide an output of 0 to 3.2V as the input went from 2.05V to 3.18V. Have I understood this correctly?
 
That should be how it all works; the bias voltage/reference may end up being halfway through your range, depending on the internals of the amp and how it is powered. If you are chasing precision you need to make sure the op-amp supply and the bias voltage are stable, but I suspect that for battery monitoring the ordinary supply is fine.

The op-amp supply will matter. Running it from 3.3V prevents the output going too high, but means you need an input voltage divider to drop the input below 3.3V as well. Running from 5V gives you more options on the amp itself (many do not run at 3.3V), but requires care not to drive the output above 3.3V, and still needs a voltage divider. I suspect the lowest-leakage-current way is to actually run it from the raw battery input, with the inverting input also directly connected, and take the difference between that and the reference, but working out how to get the gain to produce the right output range is making my head hurt.
 