Hi all,
I recently picked up a Teensy 3.6. I've used Teensy and Arduino for several other projects, but I've never pushed timing as hard as I need to right now, so I'm looking for some guidance.
My program will measure the time it takes to discharge the energy stored in a coil, usually around 100 microseconds. The program needs to do the following:
- Digitally read pins 11 and 12 "continuously"
- When pin 11 goes high, start a timer with an accuracy of +/-0.1 microseconds (100 ns, or better if possible)
- When pin 12 goes high, stop the timer with the same accuracy
- Pass the measured value into an if statement to turn on LEDs, write to an LCD screen, compare the measured value against upper and lower bounds, etc. (all of this can happen in a leisurely 3-5 seconds)
What I'm looking for is help with the nanosecond timer. What's the easiest way to get timing accuracy of +/-100 ns or better? I've already checked with an oscilloscope and found that a digitalRead() takes a fairly steady 100 ns, so two of those instructions mean a total measurement error of at least 200 ns. An additional 100 ns of error in the timer itself would bring the overall error up to 300 ns, which is acceptable, but not much more than that.
Suggestions?
Thank you!
Dylan