Teensy 16bit ADC

Indeed there is a general signal processing theory about oversampling by simple averaging. If the signal has random uncorrelated white noise (and no other sources of error), you gain approximately 1 bit of additional resolution for every factor of 4 samples you average together.

Here's a very simple program you can run to put that theory to the test.

Code:
const unsigned int bits = 4;  // number of bits of resolution to (theoretically) add

void setup() {
  Serial.begin(9600);
  analogReadResolution(12);   // start from the hardware's 12-bit readings
}

void loop() {
  // Averaging 4^bits samples should gain about "bits" extra bits of resolution.
  unsigned int sum = 0;
  const unsigned int count = (unsigned int)powf(4, bits);
  for (unsigned int n = 0; n < count; n++) {
    sum += analogRead(A0);
  }
  // Dividing the sum by 2^bits leaves a (12 + bits)-bit result.
  sum /= (unsigned int)powf(2, bits);
  Serial.println(sum);
}

I'm running it right now on a Teensy 3.2 with this hardware. It does give surprisingly good results, at least with this very simple setup where nothing else is connected. But you don't get a "nice" zero or full scale max.

DSC_0972_web.jpg
 
> If the signal has random uncorrelated white noise

Which, based on my tests (good results up to a point, then no further improvement), it isn't. But that was a different Teensy setup, so I'm curious what ENOB you get to.
 
Just out of curiosity I dug out a PCB with a Linear Technology LTC1867L and wired it up to a T4.1. I collected data at 1000 samples per second to check noise levels.

Test setup: T4.1 connected to the LTC1867L, reading the voltage of a single NiMH cell.

The LTC1867L is good for 175 ksamples/second max and the input range is 0 to 2.5 V, so one LSB is about 38.1 microvolts.

I collected about 155 seconds of data to an SD card, plugged the card into my PC and analyzed the results with Matlab:

min 1.3719V
mean 1.3722V
max 1.3725V

Std. deviation: 55.9 microvolts, or about 1.5 LSBs.

It seems possible to get pretty good results with an external ADC.

Here's a picture of the setup:
ext_adc.jpg
 
That's similar to the 15.5 bits I got with a good external ADC running from a battery.
 
Teensy 4.x are more powerful, but Teensy 3.x have lower-noise ADCs, due to the usual design tradeoffs between processor speed and high-current switching noise.

When I first got a Teensy 3.0 this forum didn't yet exist, or at least I didn't know about it, so I posted my test ADC code and results here: http://dangerousprototypes.com/forum/index.php?topic=4606.0
Quote:
Running the below code, with a sample size of 10000 readings, I get a standard deviation of just about 1 LSB 
(so RMS noise = 50 uV) and peak-peak noise of 7 counts, and a DC offset from the expected reading of 10-20 LSBs (= 0.5 to 1 mV).
So that got pretty close to a real 16-bit ADC at least for the RMS noise level, in that specific case with that circuit. I'm not sure about the DC offset though.

When I want clean ADC readings at very low speeds, I usually reach for a Microchip MCP3424, which actually achieves 18-bit resolution, but only at about 4 Hz. It can also do 16 bits at 15 Hz. It has differential inputs, which are preferred for DC accuracy. You can get this chip on a cheap breakout board from the usual online sources. I've also played with true 24-bit ADCs with around 200 nV rms noise, but it takes careful circuit work to reach their potential.
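
For reference, here's a minimal sketch of reading the MCP3424 in its 18-bit mode over I2C. It's based on my reading of the datasheet rather than tested code; the 0x68 address assumes the default address-pin strapping, and channel 1 with gain 1 is just an example configuration.

Code:
#include <Wire.h>

const uint8_t MCP3424_ADDR = 0x68;  // default address (depends on the Adr0/Adr1 pins)

void setup() {
  Serial.begin(9600);
  Wire.begin();
  // Config byte 0x1C: channel 1, continuous conversion, 18-bit / 3.75 SPS, PGA gain = 1
  Wire.beginTransmission(MCP3424_ADDR);
  Wire.write(0x1C);
  Wire.endTransmission();
}

void loop() {
  // In 18-bit mode the chip returns 3 data bytes followed by the config byte.
  Wire.requestFrom(MCP3424_ADDR, (uint8_t)4);
  if (Wire.available() < 4) return;
  uint8_t b2  = Wire.read();  // contains data bits 17..16
  uint8_t b1  = Wire.read();  // data bits 15..8
  uint8_t b0  = Wire.read();  // data bits 7..0
  uint8_t cfg = Wire.read();  // RDY flag in bit 7: 0 = fresh conversion
  if (cfg & 0x80) { delay(50); return; }  // no new result yet

  // Assemble the signed 18-bit two's-complement result.
  int32_t code = ((int32_t)(b2 & 0x03) << 16) | ((int32_t)b1 << 8) | b0;
  if (b2 & 0x02) code -= 0x40000;  // sign-extend from bit 17

  // At gain 1 the LSB is 2 * 2.048 V / 2^18 = 15.625 microvolts.
  Serial.println(code * 15.625e-6f, 6);
  delay(300);  // 3.75 SPS, so a new result roughly every 267 ms
}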
 
At low speeds, 1/f noise and temperature drift can be issues, so a good voltage reference is advised. For instance, see the reference driver circuit in the application section of the ADS8885 datasheet:
https://www.ti.com/lit/ds/symlink/ads8885.pdf
Since that's a fast ADC, the reference also has to be buffered for AC performance.

Basically, the datasheet performance of an ADC is what was measured under ideal conditions with a top-notch voltage reference; neglect the reference and the performance will drop.
 
Agreed. If you expect to calibrate and then get 16-bit-accurate readings a week later, you need to test over a similar time period and range of conditions. Don't take a millisecond of data and then conclude "it's accurate to X bits". The same applies at much shorter time scales (like picking up 60 Hz noise).
 
I did some testing, and on a T3.1 I get up to 3 bits (as in 13-bit accurate) of random offset error between reboots, which no amount of oversampling can remove. It seems to be related to noise in the internal ADC calibration process. Any ideas how to improve this?

I also see 4-bit variations in offset depending on how much delay there is between calls to analogRead(), but this is controllable.
 
There has been some interest in this question. I set up a test with a Teensy LC where I supplied an external DC signal and monitored it with both the Teensy and an Agilent 34410A DVM, using USB for the Teensy and Ethernet on the DVM. It turns out that (especially at 16 bits) noise in the environment (and noise coupled from the 34410A!) gave quite high standard deviations in the measurements. I didn't have time to filter the noise out, so instead I used the onboard DAC, connected to the ADC (A0) through 10 kΩ and decoupled with 700 nF to GND. Since I was not interested in DC accuracy, just noise, this should give close to the minimum achievable noise.

The results show that noise (in LSBs) grows from 12- to 16-bit resolution, as expected. STDEV is about 5 LSBs at 16 bits, halving for each one-bit reduction in resolution (although something strange seems to happen at 13 bits), so about 2.4 LSBs at 15 bits, and so on.

Also of note is that the STDEV increases as I increase the DAC code from 128 to 3840 (I didn't analyze DAC codes of 0 and 4095 because the ADC may 'bottom out', making the statistics not comparable). Because the STDEV increases with the input, the dominant noise source is not the ADC input but the ADC reference. For example, when the ADC is converting (say) 1/10 of full scale, a given amount of noise on the reference causes some effective LSB noise in the conversion; for an input at 90% of full scale, the same reference noise causes 9x the noise in the conversion. The results confirm this effect.

In summary, most of the noise in the Teensy LC's ADC appears to come from the reference, not the ADC input. Averaging (gaining ~1 bit per factor of 4 averaged) will generally work, unless you are digitizing a signal that is correlated with the Teensy's noise. Because of quantization noise, averaging works better when the ADC is run at its higher conversion resolutions.
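
To make that scaling concrete, here's a back-of-envelope sketch. The reference voltage and its noise figure below are assumed for illustration only, not values measured in this test: the 16-bit result is code = Vin / Vref * 65536, so rms noise on Vref shows up in the result scaled by the code itself.

Code:
// sigma(code) in LSBs ~= code * sigma(Vref) / Vref, since code = Vin / Vref * 65536
float refNoiseLSB(float code, float sigmaRef, float vref) {
  return code * sigmaRef / vref;
}

void setup() {
  Serial.begin(9600);
  const float vref = 3.3f;         // assumed reference voltage
  const float sigmaRef = 250e-6f;  // assumed 250 uV rms reference noise (illustrative only)
  Serial.println(refNoiseLSB(58982.0f, sigmaRef, vref));  // ~4.5 LSB near 90% of full scale
  Serial.println(refNoiseLSB(6554.0f,  sigmaRef, vref));  // ~0.5 LSB near 10% of full scale
}

void loop() {}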
 

Attachments

  • ADC2.ino (2 KB)
  • Teensy.txt (55.7 KB)
An external 16-bit ADC I've used has a standard deviation of 0.5 counts and a peak-to-peak of 1 count, with no averaging. With about 16000x oversampling over short periods, I got there with a standalone Teensy 3.1. But it still has a big offset (around 40 counts) and drift problems, so not what I would call 16-bit performance. No idea what the linearity is like; probably not close to the external ADC.
 
In some more testing, the Teensy 4 ADC performs better than the Teensy 3. At 4096x oversampling using the median (not the mean), short-term results at a stable temperature are good.
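
For what it's worth, here's a minimal sketch of that oversample-and-median approach. The pin (A0), 12-bit base resolution, and plain analogRead() loop are my assumptions rather than the exact setup used above.

Code:
#include <stdlib.h>

const int N = 4096;       // oversampling factor
static uint16_t buf[N];   // 8 KB buffer, fine on a Teensy 4

int compareU16(const void *a, const void *b) {
  return (int)(*(const uint16_t *)a) - (int)(*(const uint16_t *)b);
}

void setup() {
  Serial.begin(9600);
  analogReadResolution(12);
}

void loop() {
  for (int i = 0; i < N; i++) {
    buf[i] = analogRead(A0);
  }
  qsort(buf, N, sizeof(buf[0]), compareU16);
  // N is even, so average the two middle elements for the median.
  uint16_t median = (buf[N / 2 - 1] + buf[N / 2]) / 2;
  Serial.println(median);
}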
 