delayMicroseconds & Serial.read

virtualdave

Hi all,
I try to avoid having any delays in my code so it doesn't lock up the micro, but for this one case it's WAAAY less painful to add a 50us delay in the code than determine some elapsed time interval. I've also never fully understood serial buffers, but do know that "long" delays and receiving messages over serial don't mix well. So my quick question is: with only a 50us delay, if a message comes in at that moment the micro is in the delay will that incoming byte be lost, or will it be waiting for me once I'm out of the delay? This is using the t3.

Thank you!
David
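
To make the scenario concrete, here is a minimal sketch of the pattern being asked about. The port choice, baud rate, and debug printing are illustrative, not from the original post:

// Teensy 3.x: a short blocking delay inside loop() while bytes may
// arrive on a hardware serial port at any moment.
void setup() {
  Serial.begin(115200);    // USB serial, used here for debug output
  Serial1.begin(57600);    // hardware UART receiving the messages
}

void loop() {
  delayMicroseconds(50);   // the 50 us delay in question

  // Read whatever has arrived, including any byte that came in
  // while the delay was running.
  while (Serial1.available() > 0) {
    int b = Serial1.read();
    Serial.println(b, HEX);
  }
}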
 
As a rule of thumb, take the UART speed in bits per second and divide by ten (eight data bits plus start and stop bits) to get characters per second. Divide one by that result to get the time per character. Multiply by the UART buffer size to get the maximum time you can spend blocked.

For example, at 57600 baud with an 8-byte buffer:

57600 / 10 = 5760 characters per second
1 / 5760 = 0.0001736 seconds = 0.1736 milliseconds per character
0.1736 * 8 = 1.3888 milliseconds

So unless you are running at really high serial speeds, a 50 microsecond delay should be no problem: a byte that arrives during the delay just waits in the receive buffer until you read it.
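
The same rule of thumb as a small helper function (the function name is mine, and the 8-byte figure is the buffer size used in the example above, so substitute whatever your port actually buffers):

// Rule of thumb from above: how long can the code block before a
// receive buffer of the given size could overflow?
float maxBlockingMillis(unsigned long baud, unsigned int bufferSize) {
  float charsPerSecond = baud / 10.0;         // 8 data bits + start + stop
  float msPerChar = 1000.0 / charsPerSecond;  // milliseconds per character
  return msPerChar * bufferSize;              // time to fill the buffer
}

// maxBlockingMillis(57600, 8) is about 1.39 ms, matching the numbers
// above, so a 50 us (0.05 ms) delay leaves a wide margin.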
 