Serial options for setting and disabling timeouts

maelh

Member
I have run into the same issues as mentioned in this thread:
https://forum.pjrc.com/threads/3198...ransfer-over-USB?p=90685&viewfull=1#post90685

It is more critical for my use case to have complete, reliably transferred messages than to keep up the speed at all costs. With lost or garbled messages I need to add extra error checking, and the protocol becomes a lot more complex.
Reliable message delivery also makes it possible to convey "type" information through the order in which data arrives. Think of structs/arrays.

This also provides implicit flow control: data is essentially sent as fast as possible from both sides, but if buffers are full or one side does not respond fast enough, the resulting stall automatically slows the sender down as needed.
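The stalling behavior can be sketched with a bounded buffer (a hypothetical stand-in, not the actual Teensy USB buffering code): a producer that retries a failed write until it succeeds is automatically throttled to the consumer's drain rate.

```cpp
#include <cstddef>
#include <queue>

// Minimal bounded FIFO: a hypothetical stand-in for the USB TX buffer.
// try_write() fails when the buffer is full, which is exactly the point
// where a blocking Serial.write() would stall the sender.
class BoundedBuffer {
public:
    explicit BoundedBuffer(std::size_t capacity) : cap_(capacity) {}

    // Returns false when full: the producer must wait (implicit flow control).
    bool try_write(unsigned char b) {
        if (q_.size() >= cap_) return false;
        q_.push(b);
        return true;
    }

    // Consumer side: drain one byte, freeing space for the producer.
    bool try_read(unsigned char &b) {
        if (q_.empty()) return false;
        b = q_.front();
        q_.pop();
        return true;
    }

private:
    std::size_t cap_;
    std::queue<unsigned char> q_;
};
```

A producer loop that spins on try_write() until it succeeds moves data exactly as fast as the consumer drains it, with no explicit flow-control messages.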

Suggestion:
There are already settings for the baud rate. There could be an overload of Serial.begin() that sets TX_TIMEOUT in usb_serial.c (turning it into a variable instead of a constant), plus an additional parameter/flag/option to disable the timeout completely.

Disabling it would be equivalent to disabling the lines quoted in the above post:
Code:
if (++wait_count > TX_TIMEOUT || transmit_previous_timeout) {
    transmit_previous_timeout = 1;
    return -1;
}
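A rough sketch of what the suggestion could look like (hypothetical function and variable names, with a simulated wait loop standing in for the TX-packet polling; this is not actual Teensy core code), where a timeout of 0 means "never time out":

```cpp
#include <cstdint>

// Hypothetical sketch: tx_timeout_ms = 0 disables the timeout entirely.
static uint32_t tx_timeout_ms = 120;            // default, like TX_TIMEOUT today
static bool transmit_previous_timeout = false;  // "sticky" failure flag

// Hypothetical setter that Serial.begin() could call.
void usb_serial_set_write_timeout(uint32_t ms) {
    tx_timeout_ms = ms;
    transmit_previous_timeout = false;  // clear the sticky failure state
}

// Simulated wait loop: 'tx_ready' stands in for checking whether a TX
// packet buffer is available. Returns 0 once ready, -1 on timeout.
int wait_for_tx(bool (*tx_ready)(uint32_t tick)) {
    uint32_t wait_count = 0;
    while (!tx_ready(wait_count)) {
        ++wait_count;
        // With tx_timeout_ms == 0 this check is skipped and the loop
        // blocks indefinitely, which is the requested behavior.
        if (tx_timeout_ms != 0 &&
            (wait_count > tx_timeout_ms || transmit_previous_timeout)) {
            transmit_previous_timeout = true;
            return -1;
        }
    }
    return 0;
}
```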

Similarly, it would be very useful if the timeout or blocking behavior of Serial.readBytes() and related reading functions could be set.
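For the read side, the Arduino Stream API already provides Serial.setTimeout(), which readBytes() honors. The underlying pattern can be sketched in portable C++ (with a hypothetical byte source standing in for Serial.read()):

```cpp
#include <chrono>
#include <cstddef>

// Hypothetical pull-style byte source standing in for Serial.read():
// returns -1 when no byte is available yet.
using ByteSource = int (*)();

// Read up to 'len' bytes, giving up once 'timeout' elapses with no data.
// A timeout of zero means block until all 'len' bytes have arrived.
std::size_t read_bytes(ByteSource next, unsigned char *buf, std::size_t len,
                       std::chrono::milliseconds timeout) {
    using clock = std::chrono::steady_clock;
    auto deadline = clock::now() + timeout;
    std::size_t n = 0;
    while (n < len) {
        int b = next();
        if (b >= 0) {
            buf[n++] = static_cast<unsigned char>(b);
            // The timeout applies per byte, as in Arduino's timedRead().
            deadline = clock::now() + timeout;
        } else if (timeout.count() != 0 && clock::now() >= deadline) {
            break;  // timed out: return the partial count, like readBytes()
        }
    }
    return n;
}
```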
 
Until you provide clear and comprehensive info about the conditions causing this problem, I will not consider or even discuss any changes in Teensy's USB stack.

You haven't posted complete code (e.g., the "Forum Rule"). There's no info about the nature & speed of the data flow. You've not said whether the data movement is unidirectional or, if bidirectional, what ratio of IN vs OUT is used. We don't have any idea if you're doing some sort of send-ack protocol, or purely streaming, or a hybrid approach. We don't know what sort of software is on the receiving side. In fact, we don't even know if you're using Windows, Mac or Linux!

You've written almost every word about your analysis of Teensy's USB code. But without test results that are reproducible, this is a waste of time. I am not going to consider any of this, unless you actually show (don't "tell", but "show") the conditions that cause data loss.
 
unless you actually show (don't "tell", but "show") the conditions that cause data loss.
This is merely a more concrete reminder of this issue.
There is a very detailed analysis in the thread I linked, including some code samples. So I thought you were aware of the details already, since you participated in that thread and mentioned the solution I linked to.

But the major reason is: it's quite common to be able to control a serial connection's timeout, or to disable it entirely. I wasn't sure why that needed a detailed use case.

I'd like to point out that I highly respect your work and feedback as an engineer and programmer. It's truly excellent.

Given that I have asked other questions that were ignored, despite putting a lot of effort into their wording and research, this reply was a bit unpleasant.

P.S.: I do understand that there is a huge effort involved in supporting this community. I could provide code patches, but that would probably also entail questions about design decisions or the logic of the existing code in order to integrate them properly. So I am not sure how much that would reduce the workload.
 
As far as I know, that matter was fully resolved. On that thread, message #56 says:

I implemented binary packet-based communication and a python listener using a dedicated thread. I don't see any dropped packets anymore.


Look, many, many times someone has written an analysis of the code without providing any actual test conditions. We try over and over again on this forum to say that code & details to reproduce a problem are required.
 
Ok, I understand.

Currently the program is quite complex, and I get stalls/deadlocks that are probably due to some race conditions.
Somehow DAC0 is reading at the same time Serial is writing.
I haven't solved the issue yet, or been able to minimize the code sufficiently while still triggering the issue.

But having no timeouts in Serial allows for more predictable, and therefore more debuggable, behavior.

The basic design is producer/consumer: the PC is the producer, the Teensy is the consumer.
The PC generates either a data block (samples for the DAC) or a list of commands for the Teensy to process. The commands allow pausing/resuming the streaming, or querying other information from time to time at the user's discretion. The querying is not implemented yet.
It alternates between sending a fixed number of samples and a fixed number of commands, to avoid the overhead of tagging each sample with a header.

So the Teensy is mostly a consumer; currently it sends data back only when it receives a data block, and only for debugging purposes. Technically I want to avoid using ACKs.
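A minimal sketch of the framing described above (hypothetical block sizes, not the actual protocol): because the stream alternates between a fixed-size sample block and a fixed-size command block, the receiver can classify each byte purely by its position, with no per-sample header.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical frame sizes: N sample bytes, then M command bytes, repeating.
constexpr std::size_t kSamplesPerBlock = 4;
constexpr std::size_t kCommandsPerBlock = 2;

struct Parsed {
    std::vector<uint8_t> samples;   // bytes routed to the DAC queue
    std::vector<uint8_t> commands;  // bytes routed to the command handler
};

// Classify each incoming byte by its position in the repeating
// [samples | commands] pattern -- order, not tags, conveys the type.
Parsed parse_stream(const std::vector<uint8_t> &in) {
    Parsed out;
    const std::size_t period = kSamplesPerBlock + kCommandsPerBlock;
    for (std::size_t i = 0; i < in.size(); ++i) {
        if (i % period < kSamplesPerBlock)
            out.samples.push_back(in[i]);
        else
            out.commands.push_back(in[i]);
    }
    return out;
}
```

This is why reliable, in-order delivery matters here: a single dropped byte permanently shifts the pattern and turns samples into commands.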

The PC runs Windows. The streaming goes as fast as the Teensy can handle the communication, but is limited by the DAC, which gets triggered at 48 kHz by the PDB.

When Serial drops some of those debug messages, it is hard to know whether something actually wasn't called or the debug message just got lost.

I have a related question to this, where I added more examples from Teensy code that I'd like to understand:
https://forum.pjrc.com/threads/4931...nd-IDLY-values?p=166460&viewfull=1#post166460
 