On the 3.5/3.6 Kickstarter page, why are the ADCs 13-bit when they are 16-bit

Status
Not open for further replies.

SharpEars

Well-known member
From the page:

...
  • 62 I/O Pins (42 breadboard friendly)
  • 25 Analog Inputs to 2 ADCs with 13 bits resolution (Should be 2 SAR ADCs with 16 bits of resolution)
  • 2 Analog Outputs (DACs) with 12 bit resolution
...
 
The low 3 bits are always noise, no matter how good your analog circuitry.

I wanted to give everyone an honest description of the hardware's capability, not a misleading 16 bits where the bottom 3 bits turn out to be noise when you later try to use it.
 
The hardware supports 16 bits, but the useful bit depth, even with careful design, is 13 bits. The Kickstarter only advertises the bits that will not be random numbers, and even there you have to be pretty careful to achieve that.

Edit - too slow
 
The low 3 bits are always noise, no matter how good your analog circuitry.

I wanted to give everyone an honest description of the hardware's capability, not a misleading 16 bits where the bottom 3 bits turn out to be noise when you later try to use it.

Is this true even with multisampling and averaging? I seem to be able to get close to 15 ENOB on the Teensy 3.2 with enough oversampling (e.g., >= 1024 samples), even on a breadboard and using the crappy (IMHO) 1.2 V internal reference, but only when battery powered via a 9 V battery through an LM317 producing 5 V (and not USB power, which is the noisiest power source on the planet).
 
Hm, a question - I'm no expert at this, but I want to know :)

Isn't it the opposite? The A/D converter is powered from the same source as the internal reference, so if there is noise, it should cancel itself out (i.e., the internal supply voltage rises -> the ADC reference rises, so x - x = 0). Isn't the overall noise higher when the internal ADC samples the additional noise from an external reference?

It's likely that I'm wrong, but what do the experts say?
 
Hm, a question - I'm no expert at this, but I want to know :)

Isn't it the opposite? The A/D converter is powered from the same source as the internal reference, so if there is noise, it should cancel itself out (i.e., the internal supply voltage rises -> the ADC reference rises, so x - x = 0). Isn't the overall noise higher when the internal ADC samples the additional noise from an external reference?

It's likely that I'm wrong, but what do the experts say?

It's definitely not like that, because, in a nutshell, the underlying circuitry is different, including each circuit's protection from supply noise (i.e., its effective PSRR [Power Supply Rejection Ratio]). Without a doubt, a noisier supply will inject more noise into both (sub)circuits than a quieter one, but the character of the noise and its effects differ. For example, in a voltage reference the noise can cause jitter (and sometimes skew) in the output voltage; in an ADC it can cause non-linearity effects, resulting in jitter in the digital value produced.
 
Teensy 3.2 too?

The low 3 bits are always noise, no matter how good your analog circuitry.

I wanted to give everyone an honest description of the hardware's capability, not a misleading 16 bits where the bottom 3 bits turn out to be noise when you later try to use it.

Is this true of the Teensy 3.2 as well?
 
Yes, the on-chip ADC hardware isn't anywhere near a perfect 1-LSB noise floor at 16 bits!

Achieving even the best-case scenario of 13 bits is quite challenging. ADC pins require low-impedance signals. Careful attention needs to be paid to issues like ground loops carrying power supply currents. Good analog circuitry and system design are critical to achieve even 13 bits.

In these modern times of deceptive and overhyped technical specs, like 24-bit audio and 20-megapixel cameras, 13 bits may not sound like a lot, but it is in fact quite fine resolution, requiring very good analog design.
 