We've been working on a time-sensitive application where we need to react every 25 microseconds (which should be easy enough for a Teensy 3.2 @ 96 MHz).

We were seeing occasions where the processor would get hogged for about 100 microseconds for no apparent reason.
After going through all our code, we started commenting things out until we narrowed it down to the Serial.write(...) call.

Apparently, the first time we call it, the processor (interrupts included) gets hogged for roughly 100 microseconds. After this initial delay, no further hogging happens for subsequent characters being transmitted.
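For anyone who wants to reproduce this, something along these lines should show the one-off stall. This is only an illustrative sketch (the iteration count and baud value are arbitrary; baud is ignored for Teensy native USB anyway), not the code we actually shipped:

```cpp
// Hypothetical repro sketch: time the first Serial.write() against later ones.
void setup() {
  Serial.begin(115200);
  while (!Serial) ;   // wait until the host opens the virtual COM port
  delay(100);

  uint32_t dt[5];
  for (int i = 0; i < 5; i++) {
    uint32_t t0 = micros();
    Serial.write('x');          // first call is the one that stalls for us
    dt[i] = micros() - t0;
  }
  for (int i = 0; i < 5; i++) { // report after measuring, to avoid skewing
    Serial.print("write #");
    Serial.print(i);
    Serial.print(" took ");
    Serial.print(dt[i]);
    Serial.println(" us");
  }
}

void loop() {}
```

In our case only the very first write shows the ~100 µs cost; every later one returns almost immediately.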

So far, our workaround has been to send a large string over USB at boot. This actually seems to fix the issue.
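Roughly, the workaround looks like this. Again a sketch rather than our exact code: the 1 KB size and the 500 ms enumeration delay are arbitrary values we settled on, and we're assuming Serial.flush() blocks until the data has actually gone out:

```cpp
// Warm-up workaround: push a large dummy buffer through USB serial once at
// boot, so the one-off initialization cost is paid before the time-critical
// 25 us loop is started.
void setup() {
  Serial.begin(115200);
  delay(500);                         // give the host time to enumerate

  static char dummy[1024];
  memset(dummy, '.', sizeof(dummy));
  Serial.write((const uint8_t *)dummy, sizeof(dummy));
  Serial.flush();                     // block until transmission completes

  // ...only now set up the 25 us timer interrupt and the real work...
}

void loop() {}
```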

Another point that is unclear is whether this happens only on first boot, or every time we connect to the virtual COM port. We haven't taken the time to test this properly, but there also seems to be some extra delay whenever we disconnect and reconnect to the VCP.

My point is: is there any way we can pre-emptively initialize or run whatever code is causing this delay?
We have worked around it for now, but it would be good if something could be done about this (if nothing else, at least some documentation).

Thank you