Hi,
I am using this datalogger tutorial as the basis for a.. well, datalogger.
https://www.arduino.cc/en/tutorial/datalogger

I'm losing time (millis) at pretty consistent intervals.

In the tutorial, 3 analog pins are read, written to the SD card, and printed to the serial monitor.

In my test, I am just writing millis() to the SD card: no serial monitor and no pin reads.
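
For reference, this is roughly what my test sketch boils down to (simplified; the chip-select constant is BUILTIN_SDCARD because I'm using the Teensy 3.6's built-in SD slot instead of a shield):

#include <SPI.h>
#include <SD.h>

void setup() {
  // Stop here if the card doesn't initialize
  if (!SD.begin(BUILTIN_SDCARD)) {
    while (1);
  }
}

void loop() {
  // Log only the timestamp, using the same open/close pattern as the tutorial
  File dataFile = SD.open("datalog.txt", FILE_WRITE);
  if (dataFile) {
    dataFile.println(millis());
    dataFile.close();
  }
}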

Each write takes about 3-4 milliseconds. However, after about 6.8 seconds (the interval varies between tests, but stays constant within a test) I get a time jump. For example: 1000 milliseconds was recorded, then the next line would be 1070, and the millis count continues from there. About 6.8 seconds later there is another jump of roughly 50 ms, and it keeps counting from there.

I'm using a Teensy 3.6 with a Class 10 SanDisk 32 GB microSD card.

What could be the issue? With 8 analog sensors and output to the serial monitor, I get the same behavior.

I was investigating using an external RTC, but the resolution is too low.

Any ideas?