
Thread: GPIO toggle jitter

  1. #1
    Junior Member
    Join Date
    Jul 2014
    Melbourne, Australia

    GPIO toggle jitter

    I am using a Teensy 3.1 to toggle a GPIO (pin 0) as fast as possible. I have noticed considerable jitter in the pulses, which I measured to be ~350-440 ns. If I had to guess, there is some kind of background interrupt running on the Teensy (one I did not set up).

    [Attached image: WP_20160401_12_39_53_Pro.jpg]

    I do not need anything besides this GPIO pin and the SPI bus (interrupts enabled for receiving in slave mode, using btmcmahan's SPI library). Is there a way to disable any background interrupts? Ideally, the only jitter/interruption should be when the SPI receives data (which I measured at ~1.2 us).

    Code below:

     #include "t3spi.h"               // library for using the hardware SPI in slave mode
     T3SPI SPI_SLAVE;                 // instantiate the T3SPI class as SPI_SLAVE

     #define dataLength 2             // only two 16-bit words per SPI packet

     volatile uint16_t data[dataLength] = {}; // empty receive buffer
     volatile uint16_t delayCount = 15; // delay counter (time the pulse stays LOW)
     volatile uint16_t pulseCount = 15; // pulse counter (time the pulse stays HIGH)
     uint16_t Tpulse = 0;             // temporary pulse counter
     uint16_t Tduty = 0;              // temporary delay counter
     byte initialised = 0;            // the pulses are OFF until a command arrives over SPI

     void setup() {
       pinMode(0, OUTPUT);            // set the TTL output as an output (pin 0)
       digitalWriteFast(0, LOW);      // set the TTL low
       pinMode(1, OUTPUT);            // used for measuring the SPI ISR time
       digitalWriteFast(1, LOW);      // turn it off for now
       // Begin the SPI in slave mode:
       SPI_SLAVE.begin_SLAVE(SCK, MOSI, MISO, CS0); // pins 13, 11, 12, 10 respectively
       SPI_SLAVE.setCTAR_SLAVE(16, SPI_MODE0);      // 16-bit frame size, SPI mode 0
       NVIC_ENABLE_IRQ(IRQ_SPI0);     // enable the interrupt for SPI0
     }

     void loop() {
       // This loop toggles pin 0 (laser TTL) as fast as possible
       Tpulse = pulseCount;           // copy the pulse counter
       Tduty = delayCount;            // copy the delay counter
       digitalWriteFast(0, HIGH);     // ON
       while (Tpulse != 0) {          // busy-wait until the pulse counter reaches zero
         Tpulse--;                    // decrement the pulse counter
       }
       digitalWriteFast(0, LOW);      // OFF
       while (Tduty != 0) {           // busy-wait until the delay counter reaches zero
         Tduty--;                     // decrement the delay counter
       }
     }

     // ISR for the SPI0 interface
     void spi0_isr(void) {
       digitalWriteFast(0, LOW);      // make sure the laser is off
       digitalWriteFast(1, HIGH);     // set high to measure this ISR's execution time
       initialised = 1;
       // Pull the received words out of the buffer:
       SPI_SLAVE.rx16(data, dataLength);
       // Split the 16-bit word into the two fields:
       delayCount = data[1] & 0x03FF;         // delay count is the lower 10 bits
       pulseCount = (data[1] >> 12) & 0x000F; // pulse count is the upper 4 bits
       digitalWriteFast(1, LOW);      // end of the ISR; reset the debug pin to zero
     }

  2. #2
    Senior Member
    Join Date
    Nov 2015
    This sounds familiar...

    I know there is a SYSTICK interrupt that's used to keep track of time elapsed with micros(). I don't know where it's enabled though.

  3. #3
    Junior Member
    Join Date
    Jul 2014
    Melbourne, Australia
    Quote Originally Posted by Xenoamor View Post
    This sounds familiar...

    I know there is a SYSTICK interrupt that's used to keep track of time elapsed with micros(). I don't know where it's enabled though.
    Thanks for that. The jitter does go away when I call noInterrupts() in the setup routine. But then I cannot use the SPI interrupt.

    Looking at the documentation for the ARMv7-M I tried to turn off the SYSTICK timer by writing 0 to the SYST_CSR register during the setup function (setting the LSB to zero stops the counter). This had no effect on the jitter.

      SYST_CSR = 0; //disable SYSTICK timer
    I think searching for a single running interrupt is like looking for a needle in a haystack (so to speak). It would be better to just clear all interrupt registers (is there a global interrupt register?) and then re-enable only the SPI interrupt. I think I need to study the reference manual some more, but the ARM manuals are nowhere near as clear or concise as the AVR ones.

  4. #4
    Senior Member Jp3141's Avatar
    Join Date
    Nov 2012
    Your reading of delayCount and pulseCount isn't atomic -- you could get a SPI interrupt between setting Tpulse and Tduty in loop() -- best to wrap those 2 statements in a noInterrupts()/interrupts() pair.
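    A minimal sketch of that fix: noInterrupts() and interrupts() are the Arduino calls that mask and re-enable interrupts on the Teensy; they are stubbed out here so the snapshot logic can run on a host machine.

    ```cpp
    #include <cstdint>
    #include <cstdio>

    // Host-side stubs for the Arduino interrupt-masking calls. On the Teensy
    // these would disable/re-enable interrupts around the two copies so the
    // SPI ISR cannot update one counter between the two reads.
    static void noInterrupts() {}
    static void interrupts()  {}

    volatile uint16_t delayCount = 15;   // written by spi0_isr()
    volatile uint16_t pulseCount = 15;   // written by spi0_isr()

    // Take a consistent snapshot of both counters for use in loop().
    static void snapshot(uint16_t &Tpulse, uint16_t &Tduty) {
        noInterrupts();      // mask interrupts: the pair is read atomically
        Tpulse = pulseCount;
        Tduty  = delayCount;
        interrupts();        // re-enable; the SPI ISR can fire again
    }

    int main() {
        uint16_t Tpulse = 0, Tduty = 0;
        snapshot(Tpulse, Tduty);
        printf("pulse=%u delay=%u\n", Tpulse, Tduty); // pulse=15 delay=15
        return 0;
    }
    ```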

    You can change priorities for interrupts in the NVIC.

    Instead of using code to pulse on and off -- can you use analogWrite()? It uses the hardware FTM module to generate signals with zero jitter.
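    To move the busy-wait loop onto hardware PWM you'd translate the two counters into a frequency and a duty cycle for analogWriteFrequency()/analogWrite() (the Teensy calls). The per-iteration time below is an assumption, not a measured value -- check it against the scope first:

    ```cpp
    #include <cstdio>
    #include <cmath>

    // Rough mapping from the busy-wait counters to PWM parameters, assuming
    // each decrement iteration takes ~T_ITER seconds (a guess for a 96 MHz
    // Teensy 3.1 -- measure the real value on the scope).
    constexpr double T_ITER = 62.5e-9;

    struct Pwm { double freq; int duty8; };

    Pwm fromCounters(int pulseCount, int delayCount) {
        double high   = pulseCount * T_ITER;
        double low    = delayCount * T_ITER;
        double period = high + low;
        Pwm p;
        p.freq  = 1.0 / period;                       // analogWriteFrequency(0, p.freq)
        p.duty8 = (int)lround(255.0 * high / period); // analogWrite(0, p.duty8), 8-bit
        return p;
    }

    int main() {
        Pwm p = fromCounters(15, 15);  // equal high/low -> 50% duty
        printf("freq=%.0f Hz duty=%d/255\n", p.freq, p.duty8);
        return 0;
    }
    ```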
    Last edited by Jp3141; 04-02-2016 at 07:50 PM. Reason: typos

  5. #5
    Senior Member
    Join Date
    Jun 2013
    So. Calif
    Most any MCU system will have some jitter, as you have seen. At the least, the systick interrupt, typically firing every 1 ms, eats a few microseconds of CPU time. Systick needs to be left enabled as a rule.
    To get really precise microsecond-level pulse timing, you need a hardware timer set to toggle a GPIO pin on each overflow. Or PWM.
    Or there is a way to use DMA with a circular list of bit patterns.

  6. #6
    Senior Member+ Frank B's Avatar
    Join Date
    Apr 2014
    Germany NRW
    Yes, PWM is what his loop does.
    Canca, have you tried the hardware PWM?
