Just accepted my invite to the T4 Beta1 testers and got the hardware; thanks Paul & Robin!
I tried measuring a slowly changing voltage using the ADC A0 input. My voltage source was a 4300 uF capacitor charged to 3.3V and then slowly discharging through R = 11k to ground (RC time constant of about 47 seconds, so the voltage changes slowly relative to the sampling). Using 12-bit ADC mode, my code averages together 1000 readings per data point, with analogReadAveraging(16) also set, and prints the average and standard deviation for each set of 1000. My code is here:
https://github.com/jbeale1/DataAcq/blob/master/TeensyADCTest.ino
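For reference, here is a minimal sketch along the same lines as the linked file (this is an illustration, not the actual TeensyADCTest.ino; the pin choice, baud rate, and exact loop structure are my assumptions):

const int ADC_PIN = A0;     // node between the capacitor and the 11k resistor
const int NSAMPLES = 1000;  // readings combined into each reported statistic

void setup() {
  Serial.begin(115200);
  analogReadResolution(12);  // 12-bit mode: codes 0..4095
  analogReadAveraging(16);   // 16x hardware averaging per analogRead()
}

void loop() {
  // accumulate sum and sum-of-squares to get mean and standard deviation
  double sum = 0.0, sumsq = 0.0;
  for (int i = 0; i < NSAMPLES; i++) {
    int v = analogRead(ADC_PIN);
    sum += v;
    sumsq += (double)v * (double)v;
  }
  double mean = sum / NSAMPLES;
  double sdev = sqrt((sumsq - sum * mean) / (NSAMPLES - 1));
  Serial.print(mean, 3);
  Serial.print(", ");
  Serial.println(sdev, 3);
}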
Everything worked as expected, except for one small jump (differential nonlinearity) near, but not quite at, the midpoint code. In the enclosed graph, at set number 497 the standard deviation of that set of 1000 samples jumps to 6 while all the others are near 1.5, and you can see a step offset in the average value, which is sloping down through about code 2078 at that point. The exact midpoint would be code 2048, which in this run arrives 5 sets later, so the jump is not precisely at the midpoint.
I noticed this note in the IMXRT1050 Reference Manual, p.3141:
I wonder if this could be related? Apart from that one jump, the rest of the transfer curve looks smooth, and the ADC has enough noise that averaging together 1000 samples gives better than 1 LSB of resolution; with roughly 1.5 LSB of rms noise, the standard error of a 1000-sample mean is about 1.5/sqrt(1000) ≈ 0.05 LSB. (Some ADCs get "stuck" on each value, so averaging many samples still shows 1-LSB steps on a slow ramp, but that's not the case here.)
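As a sanity check on that last point, here is a small host-side C++ simulation (not Teensy code; the 1.5-LSB rms noise figure and the ramp values are just taken from the numbers above). It quantizes a slow ramp with and without added Gaussian noise, averaging 1000 readings per point; the noiseless column stays stuck on 1-LSB steps while the noisy column tracks the ramp to well under 1 LSB:

#include <cstdio>
#include <cmath>
#include <random>

int main() {
  const int N = 1000;                               // samples averaged per point
  std::mt19937 rng(12345);
  std::normal_distribution<double> noise(0.0, 1.5); // ~1.5 LSB rms noise

  for (double v = 2047.0; v <= 2049.0; v += 0.25) { // slow ramp, in LSB units
    double quietSum = 0.0, noisySum = 0.0;
    for (int i = 0; i < N; i++) {
      quietSum += std::floor(v + 0.5);              // ideal noiseless ADC
      noisySum += std::floor(v + noise(rng) + 0.5); // ADC with input noise
    }
    std::printf("in=%8.3f  quiet=%8.3f  noisy=%8.3f\n",
                v, quietSum / N, noisySum / N);
  }
  return 0;
}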
regards, john