Teensy 3.6, SD Card, missing time in milliseconds
Hi,
I am using this datalogger tutorial as the basis for a... well, a datalogger.
https://www.arduino.cc/en/tutorial/datalogger
I'm losing time (millis) at pretty consistent intervals.
In the tutorial, 3 analog pins are read, written to the SD card, and printed to the serial monitor.
In my test, I am just writing millis() to the SD card: no serial monitor and no pins.
Each write takes about 3-4 milliseconds. However, after about 6.8 seconds (the interval varies per test, but stays constant within a test) I get a time drop. Example: 1000 milliseconds is recorded, the next line reads 1070, and the millis count continues from there... 6.8 seconds later there is another ~50 ms jump, and the count carries on from there.
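My test loop is essentially the tutorial sketch stripped down to something like this (a simplified sketch of what I'm running, using the Teensy 3.6's built-in slot via BUILTIN_SDCARD; the serial print is only there to make the jumps easy to spot):
Code:
#include <SD.h>

const int chipSelect = BUILTIN_SDCARD;  // Teensy 3.6 on-board uSD slot
unsigned long lastMs = 0;

void setup() {
  Serial.begin(9600);
  if (!SD.begin(chipSelect)) {
    Serial.println("Card failed, or not present");
    while (true) {}
  }
  lastMs = millis();
}

void loop() {
  File dataFile = SD.open("datalog.txt", FILE_WRITE);
  if (dataFile) {
    unsigned long now = millis();
    dataFile.println(now);
    dataFile.close();            // the close/flush is where the card can stall
    if (now - lastMs > 20) {     // flag anything much slower than a normal write
      Serial.print("gap: ");
      Serial.println(now - lastMs);
    }
    lastMs = now;
  }
}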
I'm using a Teensy 3.6 with a Class 10 SanDisk 32 GB microSD card.
What could be the issue? With 8 analog sensors and output to the serial monitor, I get the same issue.
I was investigating using an external RTC, but the resolution is too low.
Any ideas?
@mborgerson: Again, thanks for the code example.
Quote (originally posted by Craig Larson): "Wouldn't that make a 1528 byte buffer ... ?"
My mistake: strike 1528 and replace with 1536.
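For anyone wondering where 1536 comes from: it's 3 x 512, i.e. three full 512-byte SD sectors per write. mborgerson's actual code isn't quoted in this post, but the general buffering pattern looks something like this (an illustrative sketch, not his code; the names are made up):
Code:
#include <SD.h>

const size_t BUF_SIZE = 1536;   // 3 x 512-byte SD sectors per write
uint8_t buf[BUF_SIZE];
size_t bufUsed = 0;
File logFile;

void setup() {
  SD.begin(BUILTIN_SDCARD);              // Teensy 3.6 on-board slot
  logFile = SD.open("datalog.txt", FILE_WRITE);
}

// Queue a line in RAM; the card only ever sees writes of exactly
// 1536 bytes, i.e. three full sectors' worth of data at a time.
void bufferedLog(const char *line) {
  for (size_t i = 0; line[i] != '\0'; i++) {
    buf[bufUsed++] = line[i];
    if (bufUsed == BUF_SIZE) {
      logFile.write(buf, BUF_SIZE);
      bufUsed = 0;
    }
  }
}

void loop() {
  char line[32];
  snprintf(line, sizeof(line), "%lu\n", millis());
  bufferedLog(line);
}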
For others following this thread: I also found that writes to the SD card caused a voltage sag on connected circuitry despite using a large capacitor, which put a dent in my ADC readings. To solve this I did two things: 1. The SD card reader was given a dedicated power supply. 2. Each ADC reading was adjusted using a measured VCC based on AREF; Andreas Spiess shows how to do this in his video #10 (see the sketch below).
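For reference, the usual AVR version of that trick reads the internal ~1.1 V bandgap against VCC and back-calculates the real supply voltage (an illustrative sketch of the technique from the video; this register-level code is for ATmega-based boards and will not run on the ARM-based Teensy 3.6):
Code:
// AVR-only: measure the internal ~1.1 V bandgap against VCC, then use
// the result to correct analogRead() values.
long readVccMillivolts() {
  // ATmega328: select the 1.1 V bandgap as ADC input, VCC as reference
  ADMUX = _BV(REFS0) | _BV(MUX3) | _BV(MUX2) | _BV(MUX1);
  delay(2);                        // let the reference settle
  ADCSRA |= _BV(ADSC);             // start a conversion
  while (ADCSRA & _BV(ADSC)) {}    // wait for it to finish
  return 1125300L / ADC;           // 1.1 V * 1023 * 1000 / raw reading
}

float correctedVolts(int pin) {
  float vcc = readVccMillivolts() / 1000.0;   // actual supply in volts
  return analogRead(pin) * vcc / 1023.0;      // scale by the measured VCC
}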
But now I've got silky-smooth ADC readings, indicating that the sensors themselves have very little noise.