I decided to get the logic analyzer out and measure how long the timer interrupts take in the TeensyTimerTool library on a Teensy 4.0. I varied both the CPU speed and the timer clock speed (I used the GPT timer). As expected, at higher CPU speeds the timer clock speed becomes the bigger factor in how long the interrupt takes. At 600 MHz CPU speed, the fastest usable interrupt rate is about 2 MHz with the 24 MHz timer clock and about 4 MHz with the 150 MHz timer clock.
Timer clock | 150 MHz | 450 MHz | 600 MHz | 816 MHz   (CPU speed)
24 MHz      |   700   |   480   |   440   |    -
150 MHz     |   560   |   270   |   240   |   180

Interrupt time in nsec, CPU speed (across top) vs. timer clock speed (down left)
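Those top rates follow directly from the table: at 600 MHz, the 24 MHz timer clock costs about 440 nsec per interrupt, and 1 / 440 nsec ≈ 2.3 MHz; the 150 MHz timer clock costs about 240 nsec, and 1 / 240 nsec ≈ 4.2 MHz.

In case it helps anyone reproduce this: on the Teensy 4.0 the GPT and PIT timers are fed from the PERCLK root, and one way to switch that between the 24 MHz oscillator and the 150 MHz IPG clock is via CCM_CSCMR1 (register names from Teensyduino's imxrt.h). If I recall correctly, TeensyTimerTool's user config also has a setting for this, which is the safer route since the library computes periods from the clock it expects. The sketch below is just to show where the 24 MHz vs. 150 MHz choice lives, not necessarily how you should make it:

Code:
// One way to select the clock feeding the GPT/PIT timers (PERCLK) on a
// Teensy 4.0, using register names from Teensyduino's imxrt.h.
// Call before starting the timer.
void timerClock150MHz() {
    CCM_CSCMR1 = (CCM_CSCMR1 & ~CCM_CSCMR1_PERCLK_PODF(0x3F)) // divider = 1
                 & ~CCM_CSCMR1_PERCLK_CLK_SEL;                // source = 150 MHz IPG clock
}

void timerClock24MHz() {
    CCM_CSCMR1 |= CCM_CSCMR1_PERCLK_CLK_SEL;                  // source = 24 MHz oscillator
}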
I measured the time by continuously toggling an output in the main loop. When an interrupt occurs, the toggling stops, so the interrupt time shows up as a gap in the pulse train (see picture below). The only thing done in the timer callback was to toggle another output high and low, which is about 4-8 clock cycles and therefore pretty negligible.

As a check, I also decreased the timer period until the toggling output stopped completely, indicating that the CPU was fully occupied servicing interrupts. The periods where this happened were very similar to the times measured above, maybe slightly longer (a sketch of this variant follows the code below).
Here is the code used to create these measurements:
Code:
#include "TeensyTimerTool.h"
using namespace TeensyTimerTool;

PeriodicTimer t1(GPT2); // one of the general purpose timers (GPT)

void setup() {
    pinMode(12, OUTPUT);                // toggled continuously in loop()
    pinMode(20, OUTPUT);                // pulsed once per timer interrupt
    t1.begin(callback, 1.000);          // period in microseconds -> 1 MHz interrupt rate
}

void callback() {
    // Minimal work: a short pulse marking the interrupt on the analyzer
    digitalWriteFast(20, 1);
    asm ("dsb");                        // make sure the write completes before the next one
    digitalWriteFast(20, 0);
}

void loop() {
    while (true) {
        // Continuous toggling; any interrupt shows up as a gap in this pulse train
        digitalWriteFast(12, 1);
        asm ("dsb");
        digitalWriteFast(12, 0);
        asm ("dsb");
    }
}
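And here is a sketch of the starvation check. The only change from the code above is the shorter period, which gets lowered between runs until the pin 12 toggling disappears (the 0.4 us value is just an illustration, not one of the measured settings):

Code:
#include "TeensyTimerTool.h"
using namespace TeensyTimerTool;

PeriodicTimer t1(GPT2);

void setup() {
    pinMode(12, OUTPUT);
    pinMode(20, OUTPUT);
    // Shorter period than above; lower it between runs until the
    // toggling in loop() never gets a chance to run.
    t1.begin(callback, 0.400);          // 0.4 us -> 2.5 MHz interrupt rate (illustrative value)
}

void callback() {
    digitalWriteFast(20, 1);
    asm ("dsb");
    digitalWriteFast(20, 0);
}

void loop() {
    while (true) {
        digitalWriteFast(12, 1);
        asm ("dsb");
        digitalWriteFast(12, 0);
        asm ("dsb");
    }
}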