I’ve seen posts questioning the accuracy of the micros() or millis() function.
I wrote a sketch that measures the number of micros() in an accurately known period of time. I ran these tests on a Teensy 3.2, but I believe the same 16 MHz crystal may be used on various other Teensy models. I used the PPS output of a GPS module as a precise one-pulse-per-second trigger and tallied micros() over a 60-second period.
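For anyone who wants to try this, here's a minimal sketch of the measurement idea (not my exact test code). The PPS pin number is an assumption; wire yours accordingly.

```cpp
// Count micros() between GPS PPS pulses over a 60-second window.
// Assumes the PPS output is wired to pin 2 (adjust for your setup).
const int PPS_PIN = 2;

volatile uint32_t firstPulse = 0, lastPulse = 0;
volatile uint32_t pulseCount = 0;

void onPPS() {
  uint32_t now = micros();
  if (pulseCount == 0) firstPulse = now;  // timestamp of the first pulse
  lastPulse = now;                        // timestamp of the latest pulse
  pulseCount++;
}

void setup() {
  Serial.begin(115200);
  pinMode(PPS_PIN, INPUT);
  attachInterrupt(digitalPinToInterrupt(PPS_PIN), onPPS, RISING);
}

void loop() {
  if (pulseCount >= 61) {  // 61 pulses bracket 60 one-second intervals
    detachInterrupt(digitalPinToInterrupt(PPS_PIN));  // freeze the tally
    uint32_t intervals = pulseCount - 1;
    uint32_t total = lastPulse - firstPulse;  // unsigned math spans a rollover
    Serial.print("average micros() per second: ");
    Serial.println((double)total / intervals, 2);
    while (true) ;  // done; reset the board to measure again
  }
}
```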
A perfect Teensy would measure a million micros() in one second. The four Teensies that I measured ranged from 999980 to 999983 micros() in one second, an error of 17 to 20 parts per million. This might be accurate enough for some applications. But if you were making a clock, for example, 999980 micros() in a second would accumulate an error of 1.72 seconds per day, nearly a minute in a month. By correcting for the actual number of micros() in a second you can greatly improve timing accuracy.
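The correction itself is simple: scale raw micros() by your board's measured rate. A minimal sketch, assuming the 999980 figure from my boards (substitute your own measurement) and a 64-bit tally to sidestep the 32-bit micros() rollover:

```cpp
// Calibration constant from the PPS test; use your own board's value.
const double MEASURED_MICROS_PER_SEC = 999980.0;

uint64_t totalMicros = 0;  // 64-bit tally, immune to 32-bit rollover
uint32_t lastMicros = 0;

// Call more often than micros() rolls over (about every 71 minutes).
uint64_t micros64() {
  uint32_t now = micros();
  totalMicros += now - lastMicros;  // unsigned subtraction spans a rollover
  lastMicros = now;
  return totalMicros;
}

// True elapsed seconds since startup, corrected for the oscillator error.
// The board counts only 999980 micros() per true second, so dividing the
// raw tally by the measured rate recovers true seconds.
double correctedSeconds() {
  return (double)micros64() / MEASURED_MICROS_PER_SEC;
}
```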
It would be wonderful if the micros() error were constant, but it varies with temperature. I made the same measurement over a range of temperatures from 67 °F to 79 °F. Temperature was measured with a thermistor connected to an analog input on the Teensy. The data show fractional values for micros() per second because each value is averaged over a one-minute period. With these data, it's relatively easy to correct for temperature variation in software. Of course, each Teensy may have its own unique error and temperature variation.
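One way to apply the temperature data is a linear fit of the measured rate against temperature, something like the sketch below. The slope, intercept, and reference temperature are placeholders, not values from my data; fit them from your own board's measurements.

```cpp
// Placeholder coefficients; fit these from your own rate-vs-temperature data.
const double RATE_AT_REF   = 999980.0;  // micros()/s at the reference temp (hypothetical)
const double REF_TEMP_F    = 73.0;      // reference temperature in °F (hypothetical)
const double SLOPE_PER_DEG = 0.2;       // change in micros()/s per °F (hypothetical)

// Estimated micros() per true second at the current temperature.
double microsPerSecond(double tempF) {
  return RATE_AT_REF + SLOPE_PER_DEG * (tempF - REF_TEMP_F);
}

// Corrected elapsed seconds, given a raw 64-bit micros() tally and the
// current thermistor reading converted to °F elsewhere in the sketch.
double correctedSeconds(uint64_t rawMicros, double tempF) {
  return (double)rawMicros / microsPerSecond(tempF);
}
```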
Comments and questions are welcome.