teensy 3.1 elapsedMicros seems to have bugs

Status
Not open for further replies.

keithTO

Member
Hi everyone,

I am trying out a very simple program that sends a fixed decimal number (e.g. the number 10) to the computer through USB Serial at fixed timing intervals.

Here is the program:
Code:
elapsedMicros sinceLastRead;

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (sinceLastRead > 1000 * 10) {
    sinceLastRead = 0;
    Serial.println(10);  
  }
}


This code works fine when I send one message per 10 ms (i.e. 1000 * 10 microseconds). The timing of the received messages is measured on my computer (0.01 s delay between messages):
Screen Shot 2014-07-04 at 15.42.06.png

However, if I change the timing to one message per 1 ms, the received timing is wrong (the computer is getting a 0.00004 s delay between messages):
Code:
elapsedMicros sinceLastRead;

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (sinceLastRead > 1000 * 1) { // change is HERE
    sinceLastRead = 0;
    Serial.println(10);  
  }
}

Screen Shot 2014-07-04 at 15.43.48.png

Since my only change is the check on elapsedMicros, I suspect this is a bug in the library/hardware (or am I missing something obvious?).

Any help is appreciated!
 

Attachments

  • Screen Shot 2014-07-04 at 15.39.22.png
  • Screen Shot 2014-07-04 at 15.43.57.png
Could it be that with a short time interval, multiple USB datums are combined and sent as a single USB frame? (Inherent to USB in this mode.)

1 ms precision might not be doable.

You could time-tag sent datums at the source; time-of-receipt could have a lot of jitter due to the above, as well as different USB drivers and operating system latencies.
 
Last edited:
Thanks stevech for the quick response! Looking at my computer's measurements, it seems the computer is getting data a lot faster (much faster than every 1 ms) when I try to send one message per 1 ms. If data were combined, wouldn't it be slower instead of faster?

My end goal is to send microphone data (sampled on port AD0) to the computer and do some processing there. My ideal sampling rate is 44k. Am I approaching this problem correctly?

Thanks!
 
Combining data in one USB frame (as subframes, if you will) would tend to give all the bytes in all "subframes" about the same time of receipt, i.e., faster.

44 ksps: I think you have to time-tag each sample so the receiving end can know the delta-t of the actual samples, independent of the transmission delays and operating system delays.
Maybe the time tag is a delta rather than an absolute, so that the time tags are only a few bytes (say, a 16-bit uint), where the LSB is whatever makes sense when sampling at 44 ksps.

The audio library in Teensy 3 might have examples and be helpful.
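A minimal Python sketch of this delta-tag idea (the 16-bit width and microsecond units here are illustrative assumptions, not a fixed protocol): the sender tags each sample with the gap since the previous one, reduced modulo 2^16, and the receiver accumulates the tags to rebuild absolute sample times.

```python
def make_delta_tags(sample_times_us):
    """Sender side: reduce each inter-sample gap to a 16-bit delta tag."""
    tags = []
    prev = 0
    for t in sample_times_us:
        tags.append((t - prev) & 0xFFFF)  # assumes real gaps stay below 65536 us
        prev = t
    return tags

def rebuild_timestamps(tags, start_us=0):
    """Receiver side: accumulate the delta tags back into absolute times."""
    t = start_us
    out = []
    for d in tags:
        t += d
        out.append(t)
    return out

# At 44.1 ksps the nominal gap is about 22.7 us, so 16 bits is ample headroom.
print(rebuild_timestamps(make_delta_tags([23, 45, 68, 91])))  # -> [23, 45, 68, 91]
```

This keeps per-sample overhead to two bytes, while the PC recovers the exact sample spacing regardless of how USB batches the transfers.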
 
Without the PC-side code, I can't recreate your test.

Instead, I added 3 lines to your program, to pulse pin 2. Here's the code:

Code:
elapsedMicros sinceLastRead;

void setup() {
  Serial.begin(9600);
  pinMode(2, OUTPUT);
}

void loop() {
  if (sinceLastRead > 1000 * 1) { // change is HERE
    sinceLastRead = 0;
    digitalWrite(2, HIGH);
    Serial.println(10);  
    digitalWrite(2, LOW);
  }
}

I tested on a Teensy 3.1 at 96 MHz with USB Type set to Serial, and using Teensyduino 1.20-rc1.

Then I used my oscilloscope to view the waveform on pin 2. Here's a scope screenshot.

file.png

As you can see, the scope shows 1 pulse in each 1 ms division. As near as I can tell, Teensy is running the loop at the correct speed.

Maybe there's some other problem. Without details to recreate your test (you didn't even say which operating system you're running), this is the best I can do.
 
Also, it might be worth mentioning a very minor issue.

The proper way to get repeating on a 1000 us interval is this:

Code:
  if (sinceLastRead > 1000 * 1) {
    sinceLastRead -= 1000 * 1; // subtract the interval rather than resetting to zero
    // do stuff here
  }

If you set "sinceLastRead" to zero, you can "lose" any increment it makes between the time your "if" condition runs and the time the zero is written. An interrupt might occur, for example. By subtracting 1000, any increment it's made will be preserved. Software latency can still cause jitter, but overall you'll get the correct rate, even if interrupts delay the write to sinceLastRead.
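The effect is easy to see in a small simulation (a Python sketch with made-up numbers: every expiry check is assumed to run 50 us late, as if delayed by an interrupt):

```python
def trigger_times(reset_to_zero, interval=1000, delay=50, n=5):
    """Return the absolute times (us) at which the handler runs,
    when every expiry check happens `delay` us late."""
    elapsed = 0   # models the elapsedMicros object
    now = 0       # absolute time in us
    times = []
    for _ in range(n):
        wait = (interval - elapsed) + delay  # time until expiry, plus late check
        now += wait
        elapsed += wait
        if reset_to_zero:
            elapsed = 0          # discards the 50 us overshoot each period
        else:
            elapsed -= interval  # keeps the overshoot as credit
        times.append(now)
    return times

print(trigger_times(True))   # -> [1050, 2100, 3150, 4200, 5250]  (every period stretches)
print(trigger_times(False))  # -> [1050, 2050, 3050, 4050, 5050]  (long-run rate held)
```

With the subtraction, the long-run average period stays at exactly 1000 us; with the reset, every period stretches by the overshoot.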
 
Thank you Paul! I didn't post the computer side so as not to complicate the issue (programming language, OS, library variations, etc.).

Here is the environment I used: OSX 10.9.3, and I am using Python to get the USB data. Below is the complete code:
Code:
import serial
import datetime

USBPORTNAME = '/dev/tty.usbmodem406541'
USBBAUDRATE = 9600

def printUSB():
    ser = serial.Serial(USBPORTNAME, USBBAUDRATE)
    waveform = list()
    try:
        ts = datetime.datetime.now()
        while True:
            data = int(ser.readline())
            waveform.append(data)
            ttmp = datetime.datetime.now()
            print "data:%d since_last_message::%.6f" % (data, (ttmp - ts).total_seconds())  # total_seconds() is also correct for deltas >= 1 s
            ts = ttmp
    except KeyboardInterrupt:
        print "Done"

if __name__ == "__main__":
    printUSB()

Since you have shown the timing is correct on the oscilloscope, the issue is probably on the USB transmission side, as stevech suggested :) Thank you so much for the clarification!
 
I still have that board connected, printing "10" rapidly, so I ran your Python code on my Linux machine. I only changed the device name to /dev/ttyACM0.

Here's the output I see (emphasis added, 2 places).

Code:
data:10 since_last_message::0.000085
data:10 since_last_message::0.000089
data:10 since_last_message::0.000088
data:10 since_last_message::0.000088
data:10 since_last_message::0.000088
data:10 since_last_message::0.000088
data:10 since_last_message::0.000088
data:10 since_last_message::0.000088
data:10 since_last_message::0.000088
data:10 since_last_message::0.000094
data:10 since_last_message::0.000087
data:10 since_last_message::0.000085
data:10 since_last_message::0.029642   <--
data:10 since_last_message::0.000110
data:10 since_last_message::0.000075
data:10 since_last_message::0.000069
data:10 since_last_message::0.000068
data:10 since_last_message::0.000068
data:10 since_last_message::0.000068
data:10 since_last_message::0.000067
data:10 since_last_message::0.000068
data:10 since_last_message::0.000068
data:10 since_last_message::0.000067
data:10 since_last_message::0.000066
data:10 since_last_message::0.000068
data:10 since_last_message::0.000067
data:10 since_last_message::0.000067
data:10 since_last_message::0.000068
data:10 since_last_message::0.000067
data:10 since_last_message::0.000067
data:10 since_last_message::0.000067
data:10 since_last_message::0.000074
data:10 since_last_message::0.000068
data:10 since_last_message::0.000068
data:10 since_last_message::0.000067
data:10 since_last_message::0.000066
data:10 since_last_message::0.000068
data:10 since_last_message::0.000067
data:10 since_last_message::0.000068
data:10 since_last_message::0.000067
data:10 since_last_message::0.000067
data:10 since_last_message::0.000068
data:10 since_last_message::0.000067
data:10 since_last_message::0.000067
data:10 since_last_message::0.029992   <--
data:10 since_last_message::0.000106
data:10 since_last_message::0.000074
data:10 since_last_message::0.000069
data:10 since_last_message::0.000068
data:10 since_last_message::0.000076
data:10 since_last_message::0.000070
data:10 since_last_message::0.000068
data:10 since_last_message::0.000067
data:10 since_last_message::0.000068
data:10 since_last_message::0.000068
data:10 since_last_message::0.000068
data:10 since_last_message::0.000067
data:10 since_last_message::0.000068
data:10 since_last_message::0.000067
data:10 since_last_message::0.000067
data:10 since_last_message::0.000066
data:10 since_last_message::0.000067
data:10 since_last_message::0.000068
 
Sorry, I am not familiar with the USB protocol, but aren't we supposed to see a 0.001 s delay between messages?

I changed the Teensy code a little bit so that each time the data is incremented by 1 (instead of the constant 10).

It seems the received data content is correct. There are no skipped or duplicated messages (which makes me think the USB transmission should be right, with no combined or split frames?):
Screen Shot 2014-07-04 at 17.31.44.png
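A sequence check like the one described can be done on the PC side in a few lines (a hypothetical sketch; `values` stands for the integers read from the serial port):

```python
def check_sequence(values):
    """Return the positions of skipped and duplicated entries in a stream
    that is supposed to increment by exactly 1 each message."""
    skipped, duplicated = [], []
    for i in range(1, len(values)):
        d = values[i] - values[i - 1]
        if d == 0:
            duplicated.append(i)   # same value repeated
        elif d > 1:
            skipped.append(i)      # one or more values missing before this one
    return skipped, duplicated

print(check_sequence([1, 2, 3, 5, 5, 6]))  # -> ([3], [4])
```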
 
My guess: the main issue here is likely limited resolution in Python's datetime.datetime.now() function.

User mode (not real-time) operating system scheduling of the Python interpreter might also be playing a minor role.

On the Teensy side, the USB stack waits a few milliseconds before sending a partial packet, to optimize throughput. If you want lower latency, you can override that with Serial.send_now();

Packet scheduling on the USB host controller chip and driver could also play a role.

Teensy is running your code 1000 times per second (or perhaps slightly slower, due to the issue I mentioned above). Between your Serial.println() and Python's ser.readline() is a tremendous amount of USB hardware and driver software, not to mention the OS X operating system, all of which are designed to optimize throughput while maintaining moderate latency bounds (e.g., under the 1/10 to 1/4 second that humans notice in interactive response).
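One way to probe the first point, the timestamp source itself, is to measure the smallest nonzero step each clock reports between consecutive readings (a quick diagnostic sketch, not part of the original test; it uses Python 3's time.perf_counter as a high-resolution comparison point):

```python
import datetime
import time

def min_step(now_fn, samples=20000):
    """Smallest nonzero difference between consecutive readings of now_fn()."""
    best = None
    prev = now_fn()
    for _ in range(samples):
        cur = now_fn()
        d = cur - prev
        if d > 0 and (best is None or d < best):
            best = d
        prev = cur
    return best

# datetime objects carry at best microsecond resolution; perf_counter is finer.
dt_step = min_step(lambda: datetime.datetime.now().timestamp())
pc_step = min_step(time.perf_counter)
print("datetime.now() minimum step: %.9f s" % dt_step)
print("perf_counter minimum step:   %.9f s" % pc_step)
```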
 
Python's serial module is written in Python, and is fine for UART baud rates, but it is probably a bit overhead-laden for USB virtual serial at hundreds of kbps.
 
I have some problems with elapsedMicros and a Teensy 3.2. It is a video frame simulator, so it uses relatively fast USB, etc. I am using 3 elapsedMicros timers; is there some limit to how many can be used?

If I use elapsedMillis to time a frame to be sent every 114 ms, it is all OK and video is sent at 8.7 FPS. If I use elapsedMicros, it looks like sometimes, maybe every 20 frames or so, it sends the frame too soon.

Why would elapsedMillis and elapsedMicros work differently?

This is not really a problem for the 114 ms timer, but the two other timers are much faster, and I have had some timing problems, though it is difficult to say if they are because of elapsedMicros.

Edit: after some more testing, I am now seeing the same problem with elapsedMillis, so it is likely the USB buffer, etc.
 
Last edited:
It is obviously hard to give you a complete answer, as we don't see any code and don't really know what the issue is.

I have almost never used these classes before, but there is no magic behind them.

They are simply a nice wrapper around the functions micros() and millis(), and I will probably use them in new code. I will talk about millis, but the exact same is true for micros as well.

When you create one of these objects, a member variable (ms) is set to the current millis(). When you read the object to ask for the current value, it simply returns millis() - ms.
If you set it to zero, it simply sets ms = millis(). Actually, when you set it to any value, it sets ms = millis() - value.
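To make the "no magic" point concrete, here is a Python mimic of that behavior (illustrative only; the real class is C++ and reads the hardware millis() counter, so a fake, manually advanced clock stands in for it here):

```python
class ElapsedMillisMimic:
    """Stores only an offset (ms) against a running clock, like elapsedMillis."""
    def __init__(self, clock):
        self.clock = clock        # callable returning the current time in ms
        self.ms = clock()         # on creation: ms = millis()
    def value(self):
        return self.clock() - self.ms   # reading: millis() - ms
    def set(self, v):
        self.ms = self.clock() - v      # writing: ms = millis() - value

# A fake clock we can advance by hand, standing in for millis().
t = [0]
e = ElapsedMillisMimic(lambda: t[0])
t[0] = 114
print(e.value())   # -> 114
e.set(0)           # "resetting to zero" just moves the stored offset
t[0] = 120
print(e.value())   # -> 6
```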

So why do I mention this? These values are only as accurate as the code that uses them. For example, if you wish to have code trigger when one of these values is 114, a common mistake would be comparing it for equality with 114; if you miss it and the value is 115...

Another issue, if you wish for something to start every 114 ms: if, for example, it takes 120 ms before you actually execute it and you set the value to 0, the next frame will start 114 ms after the delayed frame. If you wish to have it start on a fixed interval, instead of setting it to zero, subtract the frame time from the variable. Hope that makes sense.

As for your code: if you are using USB, it could be that you had a delay in the USB. Again, I don't know what type of host you have, or the like.
 
Thanks KurtE. It was many things, related also to the other timers I had. At first I thought elapsedMillis was working perfectly, but it turned out it had the same problem too, just not as often. I am not using == to compare.

The thing I wonder is why elapsedMillis and elapsedMicros appear to work somewhat differently, but it might be my code; I am cleaning it up now.
 