Teensy 3.2 vs. 4.0 IntervalTimer


dougm
I have a fast pseudo-serial signal that I'm trying to decode manually by sampling its state at fixed intervals. So when I detect the start of the packet, I start an interval timer that lands me in the middle of each bit. So far so good.

The problem I had with the 3.2 was that by the time the timer started, I had already lost the first bit of the packet.

I got a 4.0 thinking I could throw speed at the problem. The results are odd, though.

The amount of time it takes to detect the beginning of the packet (the falling-edge interrupt) went down dramatically, from 1.5 µs to 200 ns.

But the amount of time it takes to start the clock didn't change at all. I still miss the first bit of the packet.

Timer commands:

Code:
IntervalTimer myTimer;
myTimer.begin(tripClk, 8.6);  // call tripClk every 8.6 µs (one bit width)
myTimer.end();
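
For context, the overall pattern is roughly this (the pin number, packet length, and buffer handling below are placeholders, not my actual code):

Code:
#include <Arduino.h>  // IntervalTimer is part of the Teensy core

IntervalTimer myTimer;

volatile uint32_t bits   = 0;  // packet gets assembled here (placeholder)
volatile uint8_t  bitCnt = 0;

void tripClk()
{
  // sample the data line in the middle of the current bit
  bits = (bits << 1) | digitalReadFast(1);
  if (++bitCnt >= 15)  // 15 bits per packet is a placeholder value
  {
    myTimer.end();     // packet complete, stop sampling
  }
}

void startOfPacket()
{
  bitCnt = 0;
  bits = 0;
  myTimer.begin(tripClk, 8.6);  // one sample per 8.6 µs bit
}

void setup()
{
  pinMode(1, INPUT_PULLUP);  // pseudo-serial input
  attachInterrupt(digitalPinToInterrupt(1), startOfPacket, FALLING);
}

void loop() {}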

So my questions are: is there architecturally a better way to do this, or is there something I can do to start the timer more quickly? (Maybe a hidden suspend/resume command?)

Just for clarity: it is key that I am able to align a clock pulse to (or close to) the beginning of the packet, so starting the clock at an indeterminate point mid-cycle won't really help.

Thank you
 
Can you provide some more information?

  • What is your data clock frequency?
  • I assume that you want the first call of trpClk at T/2, and after that it should be called repeatedly at the data clock rate, right?
 
The fastest I could get with TeensyTimerTool was about 850 ns of delay after triggering. I used a TMR (QUAD) based timer in OneShot mode to get the short first delay and then retriggered it in the ISR to get the repeated calls at the clock frequency. The frame clock was generated on pin 0, which was jumpered to pin 1 (frame input). Of course, this is all just a quick proof of principle.

startdelay.png

Code:
#include "Arduino.h"

#include "TeensyTimerTool.h"
using namespace TeensyTimerTool;

OneShotTimer trpClkTimer(TMR1);  // TMR @150MHz, prescaler 1

uint32_t bitCnt;

void trpClk()
{  
  digitalWriteFast(3, HIGH);  
  delayNanoseconds(80);
  digitalWriteFast(3, LOW);
  if(--bitCnt > 0)
  {
    trpClkTimer.trigger(1.15);  // retrigger the timer
  }   
}

void pinISR()
{
  bitCnt = 8;                // arm the bit counter before triggering the timer
  trpClkTimer.trigger(0.2);  // first call
}

void setup()
{
  pinMode(0, OUTPUT);       // packet signal generator
  pinMode(1, INPUT_PULLUP); // packet signal input (jumper 0->1)
  
  //pinMode(2, OUTPUT);       // packet detected output
  pinMode(3, OUTPUT);       // timer isr;

  attachInterrupt(1, pinISR, RISING);
  trpClkTimer.begin(trpClk);
}

void loop()
{
  digitalWriteFast(0, !digitalReadFast(0)); 
  delay(10);
}
 
teensy 3.2 signal detect.jpg

On the attached scope trace, the orange line is the serial stream, the red marker (T) is the point at which I detect the signal (this is on the 3.2), and the brown top line is the clock. I start the clock at T.

If I take a measurement at every clock tick, I get 01010111100100...

Whereas the actual bits are 001010111100100... (the leading bit is lost).

Each bit is 8.6 µs wide, so the bit rate is 1/8.6 µs ≈ 116 kHz.

Thank you
 
Choose 3.7 µs for the first delay and 8.0 µs for the second (you need to hand-optimize the numbers a bit for such short times). This will give:

startdelay2.png

Teensy 4.0, of course.
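
In the sketch above, that amounts to changing just the two trigger values (a sketch; everything else stays as posted):

Code:
void trpClk()
{
  digitalWriteFast(3, HIGH);
  delayNanoseconds(80);
  digitalWriteFast(3, LOW);
  if (--bitCnt > 0)
  {
    trpClkTimer.trigger(8.0);  // subsequent samples, one per bit
  }
}

void pinISR()
{
  bitCnt = 8;
  trpClkTimer.trigger(3.7);  // first sample after the packet edge
}

Both values sit a little below the nominal T/2 = 4.3 µs and T = 8.6 µs; that difference is the hand-optimization mentioned above, presumably compensating for interrupt and trigger latency.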
 
Luni, I have successfully implemented your code and it is working great. Thank you so much!!

DougM
 