Hello All,

I have an xmodem 1k implementation that I wrote for the Teensy 3.2 that works great. When porting it over to the Teensy 4.0 I saw a huge error rate when receiving bytes, particularly in the count of bytes returned by:

int read_bytes = Serial.readBytes((char *) &xmodem_block, sizeof(xmodem_block));
I later check that all the bytes were read in with:

else if (read_bytes == sizeof(xmodem_block))
...and catch an incorrect read_bytes in the else branch, returning a NAK to the sender, which forces it to resend the block.

So, everything is working as expected with the protocol, but my finding is that the count (and maybe even the data) is corrupt at any clock speed (600, 450, 24 MHz). I thought it might be the instantiation and assignment happening inside the for loop that reads from serial, so I moved the instantiation outside of the loop. Then I started adding delays in different places. I found that adding a delay at the top of the for loop (which runs right after responding to the sender) stopped the errors. It has me a bit stumped as to why the problem arises in the first place. Can anyone shed some light on this, and possibly point me toward a solution that doesn't require the delay? As of now it works, but not at the speed it really could, and I am sure one small change elsewhere could bring the errors back.

My real question is: why would the delay fix the problem unless I am reading bytes faster than the sender is sending them? Could I add a condition that waits for the sender to finish before reading?

The delay works for now, but I am going to try forcing the full byte count to be read instead. If that works I will post back, but honestly I feel like USB serial should be handling this.