Serial Latency T2

tlb

I have something that is working, but I don't fully understand why. This is with a Teensy 2.0, running a serial channel to a Windows XP laptop.

My concern is missing my 200 µs periodic timer interrupts because of serial-channel latency. It looks like I am not missing interrupts, but there is a 7 ms delay through recv_str() that I do not understand.

I am doing:

Code:
send_str(PSTR("ZI\r\n"));           // Request data

#ifdef CHECK_LATENCY
timer3_do2 = TCNT3;                 // Snapshot Timer3 before recv_str()
count_2 = v_count;                  // Snapshot the 200 µs interrupt count
#endif

n = recv_str(inbuf, sizeof(inbuf));

#ifdef CHECK_LATENCY
timer3_do3 = TCNT3;                 // Snapshot Timer3 after recv_str()
count_3 = v_count;
#endif


I do a send_str() then a recv_str(), reading a timer to check the delays. The v_count variable is updated in the 200 µs interrupt routine. The delay from the start of recv_str() to the end of recv_str() is approximately 7 ms. I am receiving 3 bytes. I was concerned because recv_str() calls usb_serial_getchar(), which turns off interrupts with cli().

However, the v_count variable is getting updated as expected, indicating that this 7 ms delay is interruptible.

Looking further at recv_str():

Code:
uint8_t recv_str(char *buf, uint8_t size)
{
    int16_t r;
    uint8_t count = 0;

#ifdef CHECK_LATENCY
    timer3_rs = TCNT3;              // Snapshot at routine entry
    count_rs = v_count;
#endif

    while (count < size) {

#ifdef CHECK_LATENCY
        timer3_rs1[count] = TCNT3;  // Snapshot just before getchar()
        count_rs1[count] = v_count;
#endif

        r = usb_serial_getchar();

#ifdef CHECK_LATENCY
        timer3_rs2[count] = TCNT3;  // Snapshot just after getchar()
        count_rs2[count] = v_count;

        ...

The large delay is between the start of the routine and the first call of usb_serial_getchar(), that is, between timer3_rs and timer3_rs1[0]. And apparently it is interruptible. The actual usb_serial_getchar() processing takes only a few µs.

Any explanation of what is going on? And what is the compiler doing that puts all the delay before the getchar() routine? The 7 ms delay is acceptable for my application, as long as my interrupts get processed, which apparently is happening. But it is always nice to know what is going on.

recv_str() and usb_serial_getchar() are from the web site examples, apart from the added CHECK_LATENCY code.

Thanks,

TLB
 
First, it looks like you're using this serial code, right? What operating system and software is on the other end? These details matter.

Here's the best explanation I can give you, not having a complete picture (not to mention complete code to reproduce the problem) of what you're doing.

USB allocates bandwidth in 1 ms frames. This basic fact imposes a lower limit on the worst case round-trip latency you can ever hope to achieve. It's certainly possible to do much better than the 7 ms you're seeing now, but you really do need to keep a reasonable expectation based on the 1 ms USB frame time.

Most of the latency is a timeout on the Teensy side. The USB code tries to maximize bandwidth by packing subsequent writes into 64-byte USB packets. If you write fewer bytes than needed to fill a packet, it delays sending that packet for a few milliseconds. There is a function for low-latency use. Here's the relevant part of usb_serial.c:

Code:
// immediately transmit any buffered output.
// This doesn't actually transmit the data - that is impossible!
// USB devices only transmit when the host allows, so the best
// we can do is release the FIFO buffer for when the host wants it
void usb_serial_flush_output(void)
{
    ...

Notice the explanation in the comment. This function commits a packet to be transmitted the next time the USB host controller in your PC schedules bandwidth. When that happens is completely outside Teensy's control. It should normally happen at least once every USB frame, unless other devices are consuming nearly all the USB bandwidth.

If you use Arduino with Teensyduino, this function is called Serial.send_now().
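
In your code, you'd call it right after writing the request, before waiting for the reply. Something like this, reusing the send_str()/recv_str() calls from your first post:

Code:
send_str(PSTR("ZI\r\n"));    // Request data
usb_serial_flush_output();   // Commit the partial packet now, instead of
                             // waiting out the few-millisecond flush timeout
n = recv_str(inbuf, sizeof(inbuf));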

Of course, once the packet arrives at the USB host controller, it's written into the PC's memory. The host controller issues an interrupt, and the operating system's host controller driver runs, perhaps running the USB serial driver. The operating system will schedule whatever software is waiting for the data, but odds are the actual context switch happens sometime later. When your software does run, hopefully it gets its reply written before the OS runs something else. Then that reply goes into a queue of packets to be scheduled by the USB host controller. Remember that 1 ms USB frame time? When the host controller chip actually transmits the reply isn't within your control (at least not unless you do kernel-level programming directly against the host controller chip, and even then you'd be constrained by how the chip is designed).

If you use usb_serial_flush_output() or Serial.send_now(), you'll reduce the round-trip latency as much as possible. But you're simply not going to get to 0.2 ms. That's just not feasible with USB.

Generally, to process data that fast with USB, you need to queue up several items into a packet, perhaps 20 or more, and send them together. Streaming protocols are FAR better than send-and-wait-reply protocols. A common approach is to batch many data points into a packet and add a serial number or unique identifier. If confirmations are needed, you send acks asynchronously with those unique numbers, but never stop collecting data and sending while you wait for a reply. I know that's more complex than the extremely simple send-and-wait-reply approach, but for more than about 300 to 500 messages per second over USB, that's simply what you have to do. USB just isn't designed for extremely low latency.
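
As a rough illustration (the names and sizes here are hypothetical, not from any particular library), a streamed packet might look like:

Code:
#include <stdint.h>

#define SAMPLES_PER_PACKET 20

// Hypothetical streaming packet: batch many samples per USB write,
// and tag each packet with a sequence number so the host can detect
// drops and acknowledge asynchronously without stalling the stream.
typedef struct {
    uint16_t seq;                         // incrementing packet id
    uint16_t sample[SAMPLES_PER_PACKET];  // batched data points
} __attribute__((packed)) sample_packet_t;

// sizeof(sample_packet_t) is 42 bytes here. Sizing the batch so a
// packet approaches, but doesn't exceed, 64 bytes makes the best
// use of each full-speed USB packet.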

One more thing that might help is an old thread with a latency test I wrote years ago:

http://forum.pjrc.com/threads/7826-USB-to-digital-I-O-delay

If you look at those test results, a common theme is 1 ms worst-case latency at all the shortest data sizes. That's the unavoidable USB frame time. There's nothing you can do about that, other than designing your protocol well for the realities of USB communication.
 
Thanks for your reply, and sorry for the delay in responding. Previously I received an email when there was a reply to my posts, but this time I didn't get an email, so I didn't realize there was a response.

First, it looks like you're using this serial code, right? What operating system and software is on the other end? These details matter.

Yes, this is your serial code from the website. I'm using Windows XP running pySerial.

I am not concerned about the USB performance itself; I am transferring very little data over USB, only a few hundred bytes per second.

What I do have is a timer interrupt routine that is currently set to fire every 200 µs.

Code:
ISR(TIMER1_COMPA_vect)
...
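
Roughly, the whole setup looks like this. This is a trimmed-down sketch rather than my exact code; the values shown assume Timer1 in CTC mode with no prescaler on the 16 MHz clock:

Code:
#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint16_t v_count;     // incremented every 200 µs

// Timer1 in CTC mode, no prescaler: 16 MHz * 200 µs = 3200 counts
void timer1_init(void)
{
    TCCR1A = 0;
    TCCR1B = (1 << WGM12) | (1 << CS10);   // CTC, clk/1
    OCR1A = 3199;                          // 3200 counts = 200 µs
    TIMSK1 = (1 << OCIE1A);                // enable compare A interrupt
    sei();
}

ISR(TIMER1_COMPA_vect)
{
    v_count++;                 // the counter sampled in recv_str()
}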

I want to be able to guarantee that that interrupt always gets through. My concern was that, within the USB routines, interrupts are disabled:

Code:
cli()

And if interrupts were disabled for the entire 7 ms latency, my interrupt would not be able to get through. However, the interrupt is apparently getting through. There is a variable (v_count) that gets incremented in the interrupt routine, and it continues to get incremented by (at least close to) the appropriate amount.

So it appears the compiler is doing something good, moving most of the USB latency outside the time the interrupts are disabled. And the actual sending of a byte seems quite fast, which is good. Now I know most of my interrupts are getting through, but I'm still not sure how to guarantee it's not missing an occasional interrupt.
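
One sanity check I can think of is to verify that, over any measured interval, the change in v_count matches the elapsed Timer3 ticks. A rough sketch, where TICKS_PER_200US is a hypothetical constant that depends on the Timer3 prescaler (50 here, assuming 16 MHz with a /64 prescaler):

Code:
#include <avr/io.h>
#include <util/atomic.h>

#define TICKS_PER_200US 50   // Timer3 ticks per 200 µs Timer1 period

extern volatile uint16_t v_count;

uint8_t check_for_missed_ticks(void)
{
    uint16_t t0, t1, c0, c1;

    ATOMIC_BLOCK(ATOMIC_RESTORESTATE) {  // 16-bit reads must be atomic
        t0 = TCNT3;
        c0 = v_count;
    }

    // ... the send_str()/recv_str() sequence under test ...

    ATOMIC_BLOCK(ATOMIC_RESTORESTATE) {
        t1 = TCNT3;
        c1 = v_count;
    }

    // Allow one tick of slack for sampling jitter.
    return (uint16_t)(c1 - c0) + 1 < (uint16_t)(t1 - t0) / TICKS_PER_200US;
}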

Since then I have decided that, with this and other issues, I am fighting a problem that can be solved with gates. And gates are cheap. So I ordered a Teensy 3.1 and have switched to that.

Now to get the Teensy 3.1 working. It appears the C-code path (non-Teensyduino) is not nearly as well documented for Teensy 3.x as it is for Teensy 2.0. That's a shame for those of us with a hardware background who want to use specific features of the part itself, but appreciate guidance in getting all the software tools set up.

TLB
 
I want to be able to guarantee that that interrupt gets thru. My concern was that within the USB routines the interrupts were disabled. ....
And if the interrupts were disabled for the entire 7 mS latency, my interrupt would not be able to get thru.

The USB code disables interrupts for only very brief times, far less than 1 ms.
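
For example, usb_serial_getchar() uses the usual save/disable/restore pattern. Roughly:

Code:
uint8_t intr_state = SREG;   // save the current interrupt state
cli();                       // interrupts off
// ... a handful of USB endpoint register accesses ...
SREG = intr_state;           // restore; any pending interrupt fires now

A pending Timer1 compare interrupt isn't lost during a window like that: the hardware latches the interrupt flag, and the ISR runs as soon as interrupts are re-enabled. You'd only drop a tick if interrupts stayed off for more than a full 200 µs period, which these short critical sections never approach.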
 