I have something that is working but I don't fully understand. This is with a Teensy 2.0, running a serial channel to a Windows XP laptop.
My concern is that serial channel latency could make me miss 200 uSec period timer interrupts. It looks like I am not missing interrupts, but there is a 7 mSec delay through recv_str() that I do not understand.
I am doing:
Code:
send_str(PSTR("ZI\r\n")); // Request data
#ifdef CHECK_LATENCY
timer3_do2=TCNT3; // Read counter
count_2=v_count;
#endif
n = recv_str(inbuf, sizeof(inbuf));
#ifdef CHECK_LATENCY
timer3_do3=TCNT3; // Read counter
count_3=v_count;
#endif
I do a send_str(), then a recv_str(), and read a timer to check for delays. The v_count variable is updated in the 200 uSec interrupt routine. The delay from the start of recv_str() to the end of recv_str() is approximately 7 mSec, while receiving 3 bytes. I was concerned because recv_str() calls usb_serial_getchar(), which turns off interrupts with cli().
However, the v_count variable is getting updated as expected, indicating that this 7 mSec delay is interruptible.
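For context, here is a minimal sketch of the kind of 200 uSec interrupt I mean (hypothetical names and timer setup; my real code has its own configuration, and Timer3 is kept free for the TCNT3 latency snapshots):
Code:
#include <avr/io.h>
#include <avr/interrupt.h>

// Sketch only: a periodic 200 uSec interrupt that increments v_count.
// Assumes a 16 MHz clock and Timer1 in CTC mode.
volatile uint16_t v_count;               // volatile: written inside the ISR

void timer1_init(void)
{
    TCCR1A = 0;                          // CTC mode, TOP = OCR1A
    TCCR1B = (1 << WGM12) | (1 << CS10); // no prescaler
    OCR1A = 3200 - 1;                    // 16 MHz / 3200 = 5 kHz -> 200 uSec
    TIMSK1 |= (1 << OCIE1A);             // enable compare-match interrupt
    sei();
}

ISR(TIMER1_COMPA_vect)
{
    v_count++;                           // one tick every 200 uSec
}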
Looking further at recv_str():
Code:
uint8_t recv_str(char *buf, uint8_t size)
{
    int16_t r;
    uint8_t count=0;

#ifdef CHECK_LATENCY
    timer3_rs=TCNT3;                 // Read counter
    count_rs=v_count;
#endif
    while (count < size) {
#ifdef CHECK_LATENCY
        timer3_rs1[count]=TCNT3;     // Read counter
        count_rs1[count]=v_count;
#endif
        r = usb_serial_getchar();
#ifdef CHECK_LATENCY
        timer3_rs2[count]=TCNT3;     // Read counter
        count_rs2[count]=v_count;
#endif
        ...
The large delay is between the start of the routine and the first call to usb_serial_getchar(), i.e. between timer3_rs and timer3_rs1[0]. And apparently it is interruptible. The actual usb_serial_getchar() processing takes only a few uSec.
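For anyone repeating the measurement, the 7 mSec figure comes from subtracting the two TCNT3 snapshots. A sketch of the conversion, assuming Timer3 free-runs at F_CPU/8 (2 ticks per uSec at 16 MHz; adjust TICKS_PER_USEC for the actual prescaler). Unsigned subtraction survives a single counter wraparound:
Code:
#include <stdint.h>

// Sketch: convert two 16-bit TCNT3 snapshots into microseconds.
// Assumes Timer3 free-running with prescaler /8 on a 16 MHz clock.
#define TICKS_PER_USEC 2

static inline uint16_t elapsed_usec(uint16_t start, uint16_t end)
{
    uint16_t ticks = end - start;   // unsigned math handles one wraparound
    return ticks / TICKS_PER_USEC;  // ~14000 ticks for the 7 mSec gap
}

// e.g. elapsed_usec(timer3_rs, timer3_rs1[0]) isolates the mystery delay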
Any explanation of what is going on? And what is the compiler doing that puts all the delay before the getchar() routine? The 7 mSec delay is acceptable for my application, as long as my interrupts get processed, which apparently is happening. But it is always nice to know what is going on.
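If compiler reordering is a suspect, one way to rule it out (a sketch I have not verified changes anything; TCNT3 is a volatile SFR, so the timer reads themselves should already stay in program order, but a barrier also pins the non-volatile bookkeeping stores):
Code:
#include <avr/io.h>

// Sketch: full compiler barrier so the non-volatile timestamp stores
// cannot be moved across the TCNT3 reads by the optimizer.
#define barrier() __asm__ __volatile__("" ::: "memory")

extern volatile uint16_t v_count;
extern uint16_t timer3_rs, count_rs;

void snapshot(void)
{
    barrier();              // fence: nothing hoisted above this point
    timer3_rs = TCNT3;      // Read counter (volatile SFR access)
    count_rs  = v_count;
    barrier();              // fence: nothing sunk below this point
}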
recv_str() and usb_serial_getchar() are from the web site examples, apart from the added CHECK_LATENCY code.
Thanks,
TLB