I have some code that reads signals from a scientific instrument. The signal clips at a tad under 3 volts, so I would expect my highest analogRead() values to be nearly full-scale (the better part of 4095 for 12 bits, the better part of 65535 for 16 bits, and so on). Instead, my max readings are in the high 860s, i.e. the better part of a 10-bit full scale (1023).
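Here is a minimal version of my read loop, in case it matters (the A0 pin and the baud rate are just placeholders for illustration, not my actual wiring):

    void setup() {
      Serial.begin(9600);         // placeholder baud rate
    }

    void loop() {
      int raw = analogRead(A0);   // instrument signal; A0 is just for illustration
      Serial.println(raw);        // tops out in the high 860s
      delay(100);                 // slow things down for the serial monitor
    }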
So I hooked the analog line up to the 3.3 volt reference and, sure enough, I got readings of 1023.
The literature for the Teensy says it has a 16-bit A/D that is good to 12 bits of precision. Is there a way to enable this? I would be grateful for the added precision in my application.
Did some poking around, but did not turn up any posts that addressed this.
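For what it's worth, the standard Arduino call for this appears to be analogReadResolution(), so I imagine the answer involves something like the line below, but I haven't been able to confirm how the Teensy handles it (the 12 is just my guess at the usable bit depth):

    void setup() {
      analogReadResolution(12);   // guess: request 12 bits, so full scale becomes 4095
    }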
Thanks