Teensy 3.1 ADC accuracy (repeatability)

awwende

Member
I'm working on a project which requires repeatable ADC readings to within +/-5 mV. I do call analogReadResolution(12), which should give me resolution down to ~0.8 mV. What I'm seeing, though, is up to 30 mV of error.

The Teensy is powered from a 9 VDC wall wart -> 7805 -> Teensy Vin, and in software I use the measured Vin value instead of assuming it's exactly 3.3 V.

I know I can correct for the inaccuracy in software, but that would require good repeatability. When I cycle power to the Teensy, the measurement can change by up to 50 mV. This leads me to think the supply voltage is changing enough to screw up my measurements.

I came across this code for the Arduino and Teensy 2.0 (assuming it will work on the 2.0, since they are both Atmel chips): http://hacking.majenko.co.uk/making-accurate-adc-readings-on-arduino

Would there be a way to get this code to work with the Teensy 3.0/3.1?
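(For reference, a rough sketch of how that same idea, reading the internal bandgap against the 3.3 V rail to work out what the rail actually is, might translate to the Teensy 3.x. The internal channel number 39, the bandgap-buffer enable, and the nominal 1.2 V figure are assumptions about how the Teensyduino core exposes the bandgap, so verify them against your core version and the K20 datasheet.)
Code:
// Assumed adaptation of the linked AVR trick for Teensy 3.x: read the internal
// bandgap with the default (3.3 V rail) reference and back out the real rail.
// analogRead(39) as the bandgap channel, the PMC_REGSC_BGBE enable, and the
// 1.2 V nominal value are assumptions; check them against your core.
void setup() {
  Serial.begin(9600);
  analogReadResolution(12);
  analogReadAveraging(32);
  PMC_REGSC |= PMC_REGSC_BGBE;        // enable the bandgap buffer (assumed necessary)
}

void loop() {
  int raw = analogRead(39);           // internal bandgap channel (assumed)
  float rail = 1.2f * 4096.0f / raw;  // estimate of the actual 3.3 V rail
  Serial.println(rail, 4);
  delay(500);
}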
 
You probably should use analogReference(INTERNAL).

The internal reference is 1.2V, so the full scale range becomes 1.2V instead of 3.3V.
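For signals below 1.2 V, a minimal sketch of that setup might look like this (A0 is just a placeholder pin):
Code:
// Minimal sketch assuming the 1.2 V internal reference; A0 is a placeholder pin.
void setup() {
  Serial.begin(9600);
  analogReference(INTERNAL);     // full scale is now ~1.2 V
  analogReadResolution(12);      // 0-4095
  analogReadAveraging(32);       // hardware averaging of 32 samples
}

void loop() {
  int raw = analogRead(A0);
  float volts = raw * 1.2f / 4096.0f;   // counts to volts at the pin
  Serial.println(volts, 4);
  delay(500);
}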
 
I could set the internal reference for lower voltages, but most of my readings are over 1.2 V. How do I go about changing the averaging? Also, what's the default, 1?
 
Use a voltage divider (http://en.wikipedia.org/wiki/Voltage_divider) to read voltages that are bigger than your reference. With a 1.2 V reference and inputs up to about 12 volts, put a 10k resistor between GND and the analog pin you are going to read, then a 100k resistor from that pin to the voltage you are measuring. The formula is then:

Vmeasured = ((analogRead(pin) / AnalogResolution) * Vref / Resistor_to_GND) * (Resistor_to_GND + Resistor_to_Vmeasured);

Assuming settings similar to these:
Code:
analogReference(INTERNAL);     // 1.2 V internal reference
analogReadAveraging(32);       // hardware averaging of 32 samples
analogReadResolution(12);      // 0-4095

float Vm;
Vm = (((analogRead(pin) / 4096.0) * 1.2) / 10000.0) * 110000.0; // counts -> pin volts -> divider current -> input volts
Vm = ((analogRead(pin) / 3413.3333) / 10000.0) * 110000.0;      // 4096 / 1.2 folded into one divisor
Vm = (analogRead(pin) / 34133333.333) * 110000.0;               // ... with the 10k to GND folded in as well
Vm = analogRead(pin) / 310.30303;                               // ... and the 110k total folded in too
Vm = analogRead(pin) * 0.00322265625;                           // = 13.2 / 4096; as accurate as the first line (the middle three are slightly off due to rounding) and the fastest

Accurate resistors are called for, but normal 1% ones suffice; measuring the resistors and using their actual values in the formula improves accuracy further. Using 10k + 100k makes it possible to read up to (nearly) 13.2 volts.
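For example, the conversion with measured (rather than nominal) resistor values might look like this; the numbers here are made-up placeholders:
Code:
// Same conversion as above, but with the divider resistors entered as measured
// values rather than nominal ones (the values below are made-up examples).
const float R_TOP = 99700.0f;    // measured "100k" from the source to the pin
const float R_BOT = 10020.0f;    // measured "10k" from the pin to GND
const float VREF  = 1.2f;        // internal reference

float dividerVolts(int raw) {
  float vpin = raw * VREF / 4096.0f;       // voltage at the ADC pin
  return vpin * (R_TOP + R_BOT) / R_BOT;   // scale back up through the divider
}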

HTH.
 
Just because your calculator can do 10 digits doesn't mean that the real-world circuit will!

The 1.2 V reference on the ADC is trimmed quite accurately at room temperature, but varies a few % over temperature.

Also, if you are using multiple input channels fed through resistor dividers, there can be some small interactions between them if you sample at the higher rates.
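One common mitigation (my suggestion, not something from the post above) is to take a throwaway conversion after the mux switches to a new channel, so the ADC's sample capacitor settles on that divider before the reading you keep:
Code:
// Read each channel twice and discard the first conversion; pins are placeholders.
const int pins[] = {A0, A1, A2};

int readSettled(int pin) {
  analogRead(pin);          // dummy conversion, discarded
  return analogRead(pin);   // the reading actually used
}

void setup() {
  Serial.begin(9600);
  analogReadResolution(12);
}

void loop() {
  for (int p : pins) {
    Serial.println(readSettled(p));
  }
  delay(500);
}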
 
If the compiler isn't rounding my numbers adequately, then the compiler needs far more attention than it has been getting. Float rounding affects every step of every version of the formula I posted: the last line I gave is as close to 'actually accurate' as the first line, the middle three are slightly less accurate, and that last line will also execute faster. To avoid float rounding causing any inaccuracy at all (very small, and negligible in less sophisticated applications, i.e. the majority, imho) you would need hyper-accurate resistors with values chosen so that each multiplication and division in the formula never needs more than seven significant digits in total (counting both sides of the decimal point).

Calibration is always called for in designs requiring higher accuracy.
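A simple form of that is a two-point calibration: apply two known voltages near the ends of the range, record the averaged raw readings, and correct every later reading with the resulting gain and offset. A sketch, with made-up placeholder numbers:
Code:
// Two-point calibration sketch; the voltages and raw readings are placeholders
// that would come from a trusted meter and the actual ADC.
const float V_LO = 1.000f,   V_HI = 12.000f;    // known inputs from a meter
const float RAW_LO = 312.0f, RAW_HI = 3725.0f;  // averaged raw readings at those inputs

const float GAIN   = (V_HI - V_LO) / (RAW_HI - RAW_LO);
const float OFFSET = V_LO - GAIN * RAW_LO;

float calibratedVolts(int raw) {
  return GAIN * raw + OFFSET;
}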

None of the (actual, certified) electronics engineers I work with have complained about the level of accuracy my code managed using their resistor networks/voltage dividers, and neither have the clients the devices were made for; most of those devices were verified with relatively expensive measuring equipment reading awfully close to my derived measurement.

Reading multiple channels through resistor networks, with no more crosstalk than the 'reading device' itself introduces, is hardly impossible.
 
For analog interfaces you may also want to look at the temperature specification for the equipment and the effect this has on Vref and on the resistors (TCR, temperature coefficient of resistance).
The simple way is to ask the hardware engineer to determine the accuracy the hardware will support across the product's temperature range.
If you are the engineering authority (time to power up the slide rule, oops, I meant spreadsheet), then a rule of thumb:
If you have plenty of accuracy in hand, say your calculated accuracy is only 8 bits while using a 12-bit ADC, then you probably don't have to calculate it in detail.
If the equipment stays near room temperature (0-20 C) and your calculated accuracy is 10 bits, you're probably OK too.
If it is industrial/outdoors at -10 C to 80 C, 10 or 11 bits of accuracy are required, and you are producing a number of devices, then in my experience it's easier to put the sweat into calculating accuracy ranges across temperature and selecting 0.1% resistors with a low temperature coefficient.
I do it with a spreadsheet (a rough code version of that calculation is sketched below).
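For anyone without the spreadsheet handy, a rough desktop version of that worst-case budget might look like the following; every component number in it is an illustrative assumption, not a datasheet or measured figure:
Code:
// Rough worst-case error budget for a 100k/10k divider plus reference drift.
// All numbers are illustrative assumptions.
#include <cstdio>
#include <cmath>

int main() {
  const double r_top = 100e3, r_bot = 10e3;  // nominal divider
  const double tol   = 0.001;                // 0.1% resistor tolerance
  const double tcr   = 25e-6;                // 25 ppm/C temperature coefficient
  const double dT    = 55.0;                 // worst shift from 25 C over a -10 C to 80 C range
  const double vref  = 0.01;                 // assumed 1% reference drift over temperature

  const double dr    = tol + tcr * dT;       // worst fractional drift of one resistor
  const double nom   = r_bot / (r_top + r_bot);
  const double worst = r_bot * (1 - dr) / (r_top * (1 + dr) + r_bot * (1 - dr));
  const double div_err   = std::fabs(worst - nom) / nom;  // divider gain error
  const double total_err = div_err + vref;                // simple linear sum
  const double bits      = std::log2(1.0 / total_err);    // error expressed as bits of full scale

  std::printf("divider %.3f%%, total %.3f%%, ~%.1f bits of absolute accuracy\n",
              100 * div_err, 100 * total_err, bits);
  return 0;
}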
 