Teensy 3.5 ADC random noisy synchronized readout


clappertown

New member
Hi,
I am using the Teensy 3.5's two ADCs to sample two analog channels, A1 and A2, in a loop. The ADC_INIT() function is shown below:

#include <ADC.h>

ADC *adc = new ADC();        // ADC object from the ADC library
ADC::Sync_result result;     // holds one synchronized ADC_0/ADC_1 sample pair
unsigned int g_ad_counter;   // counts samples for the external (software) average

unsigned long long A1_SUM;
unsigned long long A2_SUM;
float A1_AVG;
float A2_AVG;

void ADC_INIT()
{
  const int ADC_RESOLUTION = 16;

  adc->setReference(ADC_REFERENCE::REF_EXT, ADC_0); // external 3.3 V reference for both ADCs
  adc->setReference(ADC_REFERENCE::REF_EXT, ADC_1);

  adc->setResolution(ADC_RESOLUTION, ADC_0);
  adc->setResolution(ADC_RESOLUTION, ADC_1);

  adc->setAveraging(32, ADC_0); // 32x hardware averaging for both ADCs
  adc->setAveraging(32, ADC_1);

  adc->setConversionSpeed(ADC_CONVERSION_SPEED::VERY_HIGH_SPEED, ADC_0);
  adc->setConversionSpeed(ADC_CONVERSION_SPEED::VERY_HIGH_SPEED, ADC_1);

  adc->setSamplingSpeed(ADC_SAMPLING_SPEED::VERY_HIGH_SPEED, ADC_0);
  adc->setSamplingSpeed(ADC_SAMPLING_SPEED::VERY_HIGH_SPEED, ADC_1);

  adc->startSynchronizedContinuous(A1, A2); // free-running synchronized conversions on A1/A2
  g_ad_counter = 0;
  A1_SUM = A2_SUM = 0;
}

The loop() function calls READ_ADC_CHANNELS(), shown below:

void READ_ADC_CHANNELS()
{
  // wait until both ADCs have a new conversion result
  while (!adc->isComplete(ADC_0) || !adc->isComplete(ADC_1));

  result = adc->readSynchronizedContinuous();

  A1_SUM += (uint16_t)result.result_adc0;
  A2_SUM += (uint16_t)result.result_adc1;

  g_ad_counter++;

  if (g_ad_counter == 320) // 320-sample external (software) average
  {
    g_ad_counter = 0;
    A1_AVG = A1_SUM / 320.0;
    A2_AVG = A2_SUM / 320.0;
    A1_SUM = A2_SUM = 0;
  }
}

As a result, A1_AVG and A2_AVG cover the full range 0 - 65535, and I am hoping that after heavy oversampling (32 internal averages x 320 external averages = 10240 samples per reading), I can get rid of most of the ADC input noise.
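As a rough sanity check (back-of-the-envelope only, assuming the per-sample noise is uncorrelated; expectedSigma() is just an illustrative helper, not part of my code above):

#include <math.h>

// Averaging N uncorrelated samples reduces the standard deviation by sqrt(N).
float expectedSigma(float sigmaSingleSample, unsigned long nAverages)
{
  return sigmaSingleSample / sqrtf((float)nAverages);
}

// With nAverages = 32 * 320 = 10240, sqrt(N) is about 101, so even ~100 counts
// of single-sample noise should average down to roughly 1 count.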

Interestingly, about 30% of the time after rebooting the Teensy 3.5, with the A1 and A2 inputs at 0 V, reading A1_AVG and A2_AVG many times (e.g. 100x) gives a large sigma of around 40 counts --> 40/65535 = 0.06%. If I keep rebooting the Teensy 3.5, the other ~70% of the time the sigma drops to ~1 count --> 1/65535 = 0.0015%. Assuming the 3-sigma noise is ~4 counts, it seems I can still get an effective ~14-bit ADC resolution --> 3.3 V / 16384 = 0.2 mV.
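To put a number on the effective-bits estimate (again just a sketch; effectiveBits() is an illustrative helper, and I use the 3-sigma band as the noise floor, as above):

#include <math.h>

// Noise-limited resolution estimate: log2(full scale / noise band in counts).
float effectiveBits(float noiseCounts)
{
  return log2f(65536.0f / noiseCounts);
}

// effectiveBits(4.0f)   --> 14 bits (the quiet ~70% of reboots)
// effectiveBits(120.0f) --> ~9 bits when the 3-sigma band is 3 x 40 counts (the noisy ~30%)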

I do have RC low-pass filters (~10 Hz cutoff) plus low-output-impedance voltage buffers on the A1 and A2 inputs, and I get a perfectly linear mapping from 0 - 3.3 V onto [0, 65535].
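(For reference, the cutoff is f_c = 1/(2*pi*R*C); as a purely illustrative example, not necessarily my exact parts, R = 1.6 kOhm with C = 10 uF gives roughly 10 Hz.)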

So my question is: are the ADC initialization steps in ADC_INIT() correct?
 
I think VERY_HIGH_SPEED conversion and 16-bit resolution are not recommended together.
A 14-bit result is very good with a buffer, a filter, and oversampling.

I use startSynchronizedSingleRead() with a T3.6 for sampling 6 pairs of current and voltage, 2048 samples each, in 100 ms for optimal suppression of 50/60 Hz noise. That is a 5 µs ISR every 61 µs (two of the slices are used for other things).

I get 12-bit results with less than 1% noise at FS/2 with medium-speed sampling and conversion.

Perhaps you can get better results with slower sampling and conversion speeds.
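Something along these lines is what I mean (untested sketch, and check the enum names against your ADC library version; I believe HIGH_SPEED_16BITS and MED_SPEED are the relevant ones, and startSynchronizedSingleRead()/readSynchronizedSingle() are the single-shot calls):

// In-spec settings for 16-bit conversions, instead of VERY_HIGH_SPEED
adc->setConversionSpeed(ADC_CONVERSION_SPEED::HIGH_SPEED_16BITS, ADC_0);
adc->setConversionSpeed(ADC_CONVERSION_SPEED::HIGH_SPEED_16BITS, ADC_1);

adc->setSamplingSpeed(ADC_SAMPLING_SPEED::MED_SPEED, ADC_0);
adc->setSamplingSpeed(ADC_SAMPLING_SPEED::MED_SPEED, ADC_1);

// Single-shot variant that I trigger from my timer ISR on the T3.6:
// adc->startSynchronizedSingleRead(A1, A2);
// ADC::Sync_result r = adc->readSynchronizedSingle(); // once both conversions finish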
 