virtualdave
Well-known member
Hi all,
I try to avoid delays in my code so they don't lock up the micro, but in this one case it's WAAAY less painful to add a 50us delay than to track some elapsed-time interval. I've also never fully understood serial buffers, but I do know that "long" delays and receiving messages over serial don't mix well. So my quick question: with only a 50us delay, if a message comes in while the micro is sitting in the delay, will that incoming byte be lost, or will it be waiting for me once I'm out of the delay? This is on a Teensy 3.
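Here's a minimal sketch of the pattern I mean (Serial1 and the 115200 baud rate are just placeholders for my actual setup):

Code:
void setup() {
  Serial1.begin(115200);  // placeholder baud rate
}

void loop() {
  // ... the step that needs the short pause happens here ...
  delayMicroseconds(50);  // the 50us delay in question

  // Will a byte that arrived during those 50us still be waiting here?
  while (Serial1.available() > 0) {
    int b = Serial1.read();
    // ... handle the incoming byte ...
  }
}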
Thank you!
David