Hi,
I'm using a Teensy 4 and the FreqCount library to measure frequencies in the range 1 kHz to 60 MHz (higher would be good!).
The user can select the gate time in the range 1 ms to 10 s, trading precision for speed.
I'm driving the Teensy through a very high-speed comparator and using a signal generator to check performance.
The drive waveform at Teensy pin 9 is good.
My code works perfectly up to 60 MHz with gate times of 100 ms, 1 s and 10 s.
At 1 s and 10 s I adjust the gate time very slightly to correct for the Teensy's crystal error (works well).
My problem is with short gate times, and I suspected my code.
However, I can also reproduce the problem with the example code, so maybe it's not me:
Code:
#include <FreqCount.h>

void setup() {
  Serial.begin(57600);
  FreqCount.begin(1000*1);  // gate time in usec (1000 us = 1 ms)
}

void loop() {
  if (FreqCount.available()) {
    unsigned long count = FreqCount.read();
    Serial.println(count);
    //delay(1000);
  }
}
With the delay() commented out, the result is as expected: a 40 MHz input shows 40000 ±1.
With the delay enabled, the result is 40000417.
It looks to me as if the sampling period is being set by the loop() time rather than by the interval defined in FreqCount.begin(1000*1).
Am I doing something stupid?
(It wouldn't be the first time!)
RichardL