Serial1.available() delay in TD 1.54?

aaknitt

New member
I'm seeing some unexpected delay in Serial1.available() returning a value after serial data is present. This is using a Teensy 3.2 with Arduino 1.8.15 and TD 1.54. My minimal test to show the issue has the Serial1 TX and RX pins tied together for loopback, with this test code:

Code:
#define LED_PIN 13

void setup() {
  // put your setup code here, to run once:
  Serial.begin(115200);
  pinMode(LED_PIN,OUTPUT);
  Serial1.begin(19200);
}

void loop() {
  // put your main code here, to run repeatedly:
  uint8_t temp_byte = 0;
  digitalWrite(LED_PIN,HIGH);
  Serial1.write(0x55);
  while (!Serial1.available()){}
  temp_byte = Serial1.read();
  //Serial.println(temp_byte);
  digitalWrite(LED_PIN,LOW);
  Serial1.write(0x02);
  while (!Serial1.available()){}
  temp_byte = Serial1.read();
  //Serial.println(temp_byte);
  digitalWrite(LED_PIN,HIGH);
  delay(100);
  digitalWrite(LED_PIN,LOW);
  delay(300);
}

Scope trace looks like this:
teensyserial1.jpg

If I run the same code on an Arduino Mega, I don't see the same delay when Serial1.available() gets called:
megaserial1.jpg

Finally, if I go back to the Teensy 3.2 and replace the Serial1.available() wait with Serial1.flush() (to ensure that the byte has made it out before calling Serial1.read()), the delay is gone and the values are read back correctly (confirmed by printing them out via USB serial):
Code:
#define LED_PIN 13

void setup() {
  // put your setup code here, to run once:
  Serial.begin(115200);
  pinMode(LED_PIN,OUTPUT);
  Serial1.begin(19200);
}

void loop() {
  // put your main code here, to run repeatedly:
  uint8_t temp_byte = 0;
  digitalWrite(LED_PIN,HIGH);
  Serial1.write(0x55);
  Serial1.flush();
  //while (!Serial1.available()){}
  temp_byte = Serial1.read();
  Serial.println(temp_byte);
  digitalWrite(LED_PIN,LOW);
  Serial1.write(0x02);
  Serial1.flush();
  //while (!Serial1.available()){}
  temp_byte = Serial1.read();
  Serial.println(temp_byte);
  digitalWrite(LED_PIN,HIGH);
  delay(100);
  digitalWrite(LED_PIN,LOW);
  delay(300);
}
teensyserial2.jpg

I'm wondering what might be causing the delay in Serial1.available() reporting that data is available when a Serial1.read() call shows that the data is actually there. Is this normal/expected? I'm not completely sure, but I think it may be causing some timeout issues in other legacy code that I believe was working in the past, possibly with a different Arduino/TD version.

Thanks,

Andy
 
I changed your program as shown below to measure execution time using micros() and to print the number of microseconds spent waiting for Serial1.available() after Serial1.write(). For 19200 baud, the output of the program is as shown below, and I get exactly the same results with Arduino 1.8.13 / TD 1.53 and Arduino 1.8.15 / TD 1.54. The "flight time" of one byte at 19200 baud 8-N-1 (10 bit times: start + 8 data + stop) is about 520 us. I'm not sure where the extra 500 us delay is coming from, but I don't see any difference between TD 1.53 and TD 1.54. This is all with Teensy 3.2.

55 1100
02 1101
55 1099
02 1101
....

For 115200 baud, the times from write() to available() are about 185 us, which, again, is about 2x the one-byte "flight time" (roughly 87 us). It seems like it should be 1x, but I could be missing something.

Code:
#define LED_PIN 13

void setup() {
  // put your setup code here, to run once:
  Serial.begin(115200);
  pinMode(LED_PIN,OUTPUT);
  Serial1.begin(19200);
}

void loop() {
  // put your main code here, to run repeatedly:
  uint8_t temp_byte = 0;
  uint32_t t0,t1;
  
  digitalWrite(LED_PIN,HIGH);
  Serial1.write(0x55);
  t0 = micros();
  while (!Serial1.available()) {}
  t1 = micros();
  temp_byte = Serial1.read();
  Serial.printf( "%02hX %5lu\n", (uint16_t)temp_byte, t1-t0 );
  
  digitalWrite(LED_PIN,LOW);
  Serial1.write(0x02);
  t0 = micros();
  while (!Serial1.available()) {}
  t1 = micros();
  temp_byte = Serial1.read();
  Serial.printf( "%02hX %5lu\n", (uint16_t)temp_byte, t1-t0 );
  
  digitalWrite(LED_PIN,HIGH);
  delay(100);
  digitalWrite(LED_PIN,LOW);
  delay(300);
}
 
As another test, I modified loop() to send first 1 byte, then 2 bytes, etc., and measured the total time spent waiting for N bytes to be received. The wait time is always the flight time for N+1 bytes. So, it looks like there is a 1-byte "delay" in sending the first serial byte. I looked at the code in cores\Teensy3\serial1.c, and while I don't understand it all, I can see that when the buffer is empty, the transmitter is enabled, and it looks like the code then waits for the TDRE flag (Transmit Data Register Empty) to be set. Perhaps what is happening is that it takes 1 byte-time for that condition to occur.

I don't know if that one-byte-time delay could be eliminated. I checked the K20 reference manual, and the behavior of the UART flags is pretty complex. Perhaps someone with more knowledge of that code will confirm or otherwise comment.


Code:
#define LED_PIN 13

void setup() {
  // put your setup code here, to run once:
  Serial.begin(115200);
  pinMode(LED_PIN,OUTPUT);
  Serial1.begin(19200);
  while (!Serial) {}
}

int nbytes = 1;

void loop() {
  // put your main code here, to run repeatedly:
  uint8_t temp_byte = 0;
  uint32_t t0,t1;
  
  for (int i=0; i<nbytes; i++)
    Serial1.write(0x55);
  t0 = micros();
  for (int i=0; i<nbytes; i++) {
    while (!Serial1.available()) {}
    temp_byte = Serial1.read();
  }
  t1 = micros();
  Serial.printf( "%02hX %5lu %5lu\n", (uint16_t)temp_byte, nbytes, t1-t0 );
  
  if (++nbytes > 10)
    nbytes = 1;
  
  digitalWrite(LED_PIN,HIGH);
  delay(100);
  digitalWrite(LED_PIN,LOW);
  delay(300);
}
 
Under what condition is this delay a problem? Normally, you don't wait for a byte to be sent. You just fill the buffer...
 
Hm, even if yes: under what condition is this delay a problem? Normally, you don't wait for a byte to be sent. You just fill the buffer...

Agree, it probably shouldn't be an issue. I was dealing with a LIN bus library (where TX and RX are tied together) that was waiting to see that it had successfully sent the frame header before looking for the response bytes. The added delay was odd when monitoring the bus on a scope. I suspected that it may have been causing some timing issues, but I don't think that was actually the case (I think it was more related to Serial1.end() and Serial1.begin() being called to create the serial break). I switched to a different LIN library that seems to be working, but the delay still seemed odd... I wasn't sure if it was a symptom of something that wasn't quite right.

Andy
 
Yeah, maybe @Kurt can take a look sometime. Under special circumstances it may make a small difference.
 
For Teensy 3.2, the delay exists for UART0 and UART1 (which have a FIFO), but not for UART2 (which has no FIFO), and you can eliminate the delay for UART0 and UART1 by reducing the RX FIFO watermark from 4 (default) to 1. There is no delay in the TX, but rather a delay in the RDRF interrupt on the RX side, which is what delays the Available() status.

UART0 and UART1 (with FIFO) have ILIE (Idle Line Interrupt Enable), and in the status_isr(), the receive logic is executed if either the RDRF or IDLE interrupt flag is set. I think what is happening is that the byte is received, but there is no RDRF because the RX FIFO is below the watermark. After one byte period, the IDLE interrupt flag is set, and that is what triggers the status_isr() to read the byte from the (RX) data register and make it available.

I went back and looked at the output of the program that sends 1 byte, then 2 bytes, then 3, etc., and sure enough, the delay exists for 1, 2, or 3 bytes, but not for 4 bytes, which is the default RX FIFO watermark. If you are going to send one-byte or very short messages, a work-around to the delay on the RX side is to set UARTx_RWFIFO = 1.
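
Something along these lines is what I mean (an untested sketch, assuming Serial1 maps to UART0 on Teensy 3.2, that UART0_RWFIFO from the core's kinetis.h is the right register, and that it's okay to change the watermark right after begin()):

Code:
void setup() {
  Serial1.begin(19200);
  // Drop the RX FIFO watermark from the default of 4 down to 1 so the RDRF
  // interrupt (and therefore available()) fires as soon as a single byte
  // has been received, instead of waiting for the idle-line timeout.
  UART0_RWFIFO = 1;   // Serial1 = UART0 on Teensy 3.2
}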
 
For Teensy 3.2, the delay exists for UART0 and UART1 (which have a FIFO), but not for UART2 (which has no FIFO), and you can eliminate the delay for UART0 and UART1 by reducing the RX FIFO watermark from 4 (default) to 1. There is no delay in the TX, but rather a delay in the RDRF interrupt on the RX side, which is what delays the Available() status.

I haven't found this in the documentation, but it seems likely that when there are bytes in the RX FIFO, but the count is below the watermark, the UART will eventually fire the RDRF interrupt. Perhaps this is why UARTs with FIFOs have ILIE (Idle Line Interrupt Enable), and perhaps this happens after an idle time equal to one byte?

If you are going to send one-byte or very short messages, a work-around to the delay on the RX side is to set UARTx_RWFIFO = 1.

That may actually help explain some of the "bad" behavior I was seeing related to the serial break generation. I seemed to be losing bytes when the Serial1.end() -> Serial1.begin() sequence took place. I'm wondering if there were bytes in the FIFO that got stuck because the watermark was never reached? That may not make sense if there were an idle timeout that triggered available(), though. Regardless, using a LIN library that generates a serial break correctly (using the serial break bit in the UART control register) rather than hackishly (Serial.end() -> GPIO pin toggle -> Serial.begin()) solved my issue.
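
For reference, here's roughly what I mean by "the serial break bit", as an untested sketch rather than the library's actual code (assuming Serial1 = UART0 on Teensy 3.2, and assuming the UART_C2_SBK / UART_S2_BRK13 bit macros from the core's kinetis.h):

Code:
void sendLinBreak() {
  Serial1.flush();              // let any queued TX bytes finish first
  UART0_S2 |= UART_S2_BRK13;    // break length = 13 bit times (the LIN minimum)
  UART0_C2 |= UART_C2_SBK;      // setting SBK queues a break character
  UART0_C2 &= ~UART_C2_SBK;     // clearing it right away so only one break is queued
}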
 
Okay. I'm not familiar with LIN, but at least for Teensy, end() would stop transmission and probably also result in pin configuration changes, so it would be difficult or impossible to use begin() and end() for "control". Take a look at cores\Teensy3\serial1.c to see what I mean.
 
Yeah, maybe @Kurt can take a look sometime. Under special circumstances it may make a small difference.
Normally, if I am doing something like a half-duplex protocol, I will do a SerialX.flush() to make sure everything has been fully shifted out.

Also, this code actually goes through and waits for the ending ISR to be called that says the transfer is fully done.

With T4.x, when the RX receives a character, the timeout can be one whole character time without receiving anything before the system triggers an ISR. So in the more recent releases of TD, we have code such that when you ask for available(), it checks whether the hardware has a byte ready and counts it in the available() result...

I actually never ran into issues with it on T3.x.
 
With T4.x, when the RX receives a character, the timeout can be one whole character time without receiving anything before the system triggers an ISR. So in the more recent releases of TD, we have code such that when you ask for available(), it checks whether the hardware has a byte ready and counts it in the available() result...

@KurtE, you're right. For both TD 1.53 and TD 1.54, the following are true.


  • For T4.x, available() returns the total number of bytes in the (software) buffer, plus bytes in the (hardware) FIFO, so there is no "delay" per the OP.
  • For T3.x, available() returns only the number of bytes in the buffer, so there can be a delay of up to 1 byte-time from the time a byte is RX'd and is in the FIFO until available() returns > 0 or until read() can return the byte value.

The T4.x approach requires disabling interrupts within both available() and read(), so that is a trade-off.
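
Given all that, if code on T3.x really does need to spin on available() right after a write (as in the original sketch), a bounded wait is probably safer than an open-ended one. Something like this hypothetical helper (the name and numbers are just for illustration, not anything from the core):

Code:
// Wait for a byte with a timeout instead of spinning forever, so the
// up-to-one-byte-time lag before available() goes above 0 can't hang loop()
// if the expected byte never arrives. Returns -1 on timeout.
int readByteWithTimeout(Stream &port, uint32_t timeout_us) {
  uint32_t t0 = micros();
  while (!port.available()) {
    if (micros() - t0 >= timeout_us) return -1;
  }
  return port.read();
}

Called as, for example, readByteWithTimeout(Serial1, 2000), since one byte at 19200 baud is only about 521 us.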
 