nanosecond precision


nikitchm

New member
I'm trying to make a controller for a Cambridge Technology digital servo driving galvo mirrors in a 2-photon microscopy setup. The controller has to generate signals following an SPI-like protocol with ~200 ns clock pulses, with the data (CS plus 2 channels) transmitted along the way. The commands consist of 16-bit words and have to be sent every 10 ms. Given the 72 MHz Teensy 3.1 processor, this can be done. However, if interrupts are not disabled, the code gets interrupted from time to time (in a fairly regular way), which screws up the controller.

Turning off the interrupts and timing everything with __asm__("nop\n\t") is an option, but it makes the code ugly and very inflexible (at the very least, I'd like to be able to use interrupts to schedule the 10 ms blocks).
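Something along these lines (pin numbers, the NOP count, and the command word are just placeholders; the actual NOP count would have to be tuned on a scope, since loop overhead and the pin writes add cycles on top of the ~14 ns per cycle at 72 MHz):

Code:
#include <Arduino.h>

#define NOP __asm__ __volatile__("nop\n\t")

const int CLK_PIN = 2, DATA_PIN = 3, CS_PIN = 4;   // placeholder pins

// Very crude delay: tune the NOP count with a scope until the pulse is ~200 ns.
static inline void delay200ns() {
  for (int i = 0; i < 10; i++) NOP;
}

void sendWord(uint16_t word) {
  noInterrupts();                        // keep the edge timing deterministic
  digitalWriteFast(CS_PIN, LOW);
  for (int bit = 15; bit >= 0; bit--) {  // 16 bits, MSB first
    digitalWriteFast(DATA_PIN, (word >> bit) & 1);
    digitalWriteFast(CLK_PIN, HIGH);
    delay200ns();
    digitalWriteFast(CLK_PIN, LOW);
    delay200ns();
  }
  digitalWriteFast(CS_PIN, HIGH);
  interrupts();
}

void setup() {
  pinMode(CLK_PIN, OUTPUT);
  pinMode(DATA_PIN, OUTPUT);
  pinMode(CS_PIN, OUTPUT);
  digitalWriteFast(CS_PIN, HIGH);
}

void loop() {
  sendWord(0x1234);                      // placeholder 16-bit command
  delay(10);                             // one command every 10 ms
}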

The question is: is it possible to turn off the unneeded Teensy functionality without completely disabling interrupts? I've tried using (and modifying) the 'bare-metal' makefile-based builds by Karl Lunt for Windows (link) and by Kevin Cuzner for Linux (link), but haven't succeeded with either of them so far. However, the .hex file produced with Karl's makefile results in completely predictable behavior of the Teensy (no unasked-for interrupts), which makes me believe a 'bare-metal' build is a solution here.

I'd appreciate any suggestions.

Thank you,
Max
 
Maybe reassigning interrupt priorities would work? That way, the critical interrupt could always get immediate service, even if lower priority interrupts are running?

With Teensyduino, most interrupts default to priority 128. USB defaults to 112 (lower numbers mean higher priority) and hardware serial defaults to 64.

All these are easily changeable, either by editing the code or just by using NVIC_SET_PRIORITY(irqnum, priority).
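For example, a minimal sketch (the 10 ms timer use and the priority values are just illustrative):

Code:
#include <Arduino.h>

IntervalTimer frameTimer;             // schedules the 10 ms command blocks
volatile bool sendFrame = false;

void frameISR() { sendFrame = true; }

void setup() {
  frameTimer.begin(frameISR, 10000);  // 10,000 us = 10 ms
  frameTimer.priority(0);             // 0 = highest, preempts USB (112) and serial (64)
  // Any other IRQ can be adjusted directly as well, e.g. push USB even lower:
  NVIC_SET_PRIORITY(IRQ_USBOTG, 192);
}

void loop() {
  if (sendFrame) {
    sendFrame = false;
    // shift out the 16-bit command word here
  }
}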
 
I was trying to measure signals with ~40 ns resolution some time ago, and found that interrupts occasionally (about 1 in 1000 events) appeared to get delayed by ~5 us. I never got to the bottom of it, but my guess was that because the code executes from flash, with only some of it cached, interrupt latency is not 100% predictable. My conclusion was that it's probably better to use the built-in timers/counters for this type of waveform processing than software alone.

Why not actually use the hardware SPI?
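Something along these lines, assuming the galvo protocol is close enough to plain SPI (the clock rate, mode, and CS pin are guesses):

Code:
#include <SPI.h>

const int CS_PIN = 10;

void setup() {
  pinMode(CS_PIN, OUTPUT);
  digitalWrite(CS_PIN, HIGH);
  SPI.begin();
}

void sendCommand(uint16_t word) {
  // ~200 ns clock pulses correspond to roughly a 2.5 MHz clock; adjust to the datasheet.
  SPI.beginTransaction(SPISettings(2500000, MSBFIRST, SPI_MODE0));
  digitalWrite(CS_PIN, LOW);
  SPI.transfer16(word);               // hardware shifts out the 16-bit frame
  digitalWrite(CS_PIN, HIGH);
  SPI.endTransaction();
}

void loop() {
  sendCommand(0x1234);                // placeholder command word
  delay(10);                          // one command every 10 ms
}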

Which part of the signals getting stretched actually messes up the controller? Even with interrupts enabled, the sequence of signals should still be correct.
 