Problem with falling edge interrupt delaying 1-2 microseconds inconsistently

goomysmash

New member
I'm working on a project that reads the serial signal coming from a GameCube controller using a Teensy 3.2 (datasheet).

The attached image shows how the GameCube serial signal writes data and how I want to read it with my Teensy 3.2. Here is the wiring diagram. I'm using the Arduino IDE to upload the code.

Basically the serial signal goes low for 1 microsecond then high for 3 microseconds when it's writing a 1, and 3 microseconds low then 1 microsecond high when it's writing a 0. The way I read it is by having an interrupt function trigger on a falling edge on the data line (which happens at the start of every bit no matter what), then waiting 2 microseconds so I'm in the middle of the bit, then storing the bit as a 1 if the line is high and a 0 if it's low. That's what the "ideal behavior" on the diagram is showing: my Teensy 3.2 working correctly. It does this most of the time.
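In code, the approach looks roughly like this (a stripped-down sketch of what I just described, not my actual program; the pin number and buffer size are placeholders):

Code:
#include <Arduino.h>

const int DATA_PIN = 2;                    // GameCube data line (placeholder pin)
volatile uint8_t bits[200];                // raw bits from the controller
volatile uint16_t bitCount = 0;

void dataFalling() {
  delayMicroseconds(2);                    // wait to roughly the middle of the bit
  if (bitCount < sizeof(bits)) {
    bits[bitCount++] = digitalRead(DATA_PIN);   // high = 1, low = 0
  }
}

void setup() {
  pinMode(DATA_PIN, INPUT);
  attachInterrupt(digitalPinToInterrupt(DATA_PIN), dataFalling, FALLING);
}

void loop() {
  // decode bits[] here once a full frame has arrived
}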

However, sometimes there's a random delay in when the interrupt starts, so it reads about 1-2 microseconds later than it should, samples at the wrong time, and gets the bit wrong.

Here is a copy of my code

One other thing: I've been trying 2 different microcontrollers for this, the Arduino Nano and the Teensy 3.2. I started out with an Arduino Nano, which runs at 16 MHz. With the Nano, writing the outputs normally takes so many clock cycles that I don't even need to include the line delayMicroseconds(2); the write itself takes about 2 microseconds, so it happens to line up fairly well with the timing I want without any explicit delay. It still has the issue of the interrupt randomly starting too late, though.

When I try the same thing on the Teensy 3.2 running at 96 MHz I do need to include the line delayMicroseconds(2);. However, I still get the same problem: it randomly takes a few microseconds longer than it should, reads too late, and gets the bit wrong.

Some solutions I've already tried:

- Using nops in assembly to delay 2 microseconds instead of delayMicroseconds(2). I still get the same results.

- Delaying by only 1 microsecond so there is a 3 microsecond window to get the read timing right. The interrupt still has so much variation that I sometimes get the bit wrong, on both the Teensy and the Nano.

I suspect the problem is either inside the interrupt itself or some delay that makes the interrupt fire later than it should, but I don't know. I asked on another forum and was told to look into bit-banding so I can reduce the number of instructions needed for writing. While I look into that, does anyone have other suggestions to try?

If you can help I'd be eternally grateful! Thanks for taking the time to read this.
 
I suggest you remove all the UNO register stuff in your program (DDRD, PORTD, PIND, etc.). These registers do not exist on the Teensy; they are faked for compatibility. Use digitalReadFast and digitalWriteFast.
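Something like this in the ISR, with placeholder pin numbers (just a sketch of what I mean, adjust to your wiring, and remember pinMode(DEBUG_PIN, OUTPUT) in setup):

Code:
const int DATA_PIN  = 2;    // GameCube data line (placeholder)
const int DEBUG_PIN = 13;   // scope marker pin (placeholder)

void dataFalling() {
  delayMicroseconds(2);
  digitalWriteFast(DEBUG_PIN, HIGH);            // mark the sample point for the scope
  uint8_t bit = digitalReadFast(DATA_PIN);      // single fast pin read
  digitalWriteFast(DEBUG_PIN, LOW);
  // store bit in your buffer here
}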

Another idea that may or may not work, depending on your timing requirements on the output pin, is to use Serial to receive your input signal (wire your input to the RX of one of the hardware serial ports). With the correct baud rate, such that 10 bits span 4 us, you will receive two very different characters for each 1 or 0 sent on the input.
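Roughly like this, as a sketch only. I'm assuming Serial1 on the Teensy 3.2 can be set close to 2.5 Mbaud (10 UART bits in 4 us), and the exact byte values you receive for a GameCube 0 and 1 would need to be checked on your hardware:

Code:
void setup() {
  Serial.begin(115200);
  Serial1.begin(2500000);        // 10 UART bits ~= one 4 us GameCube bit
}

void loop() {
  if (Serial1.available()) {
    uint8_t c = Serial1.read();
    // The falling edge of each GameCube bit acts as the start bit. A "1"
    // (1 us low, 3 us high) should give a byte that is mostly 1 bits, and a
    // "0" (3 us low, 1 us high) a byte that is mostly 0 bits.
    uint8_t bit = (__builtin_popcount(c) >= 4) ? 1 : 0;
    Serial.print(bit);
  }
}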
 
Ok, I tried using digitalReadFast and digitalWriteFast and avoiding the Arduino ports. Same result. It looks to me like the fastest baud rate on the Serial Monitor is 250,000? That's 1 pulse every 4 microseconds if I'm doing my math right, which means it would only be able to read once every pulse... it might work, but I'm looking for a more robust solution. I'll dive into assembly if necessary, as that seems to be what most other projects have done to solve this issue.

Here's a link to my updated code

Here's a link to what my oscilloscope looks like
 
Your oscilloscope picture is not really great for making careful measurements, but to me it looks like your delay is a bit too long. Part of your delay will be interrupt latency, so it needs to be shorter than you might think. You could try using an elapsedMicros timer instead of just a raw delay. Then if your routine gets interrupted by system interrupts, it will not result in a longer delay.
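Something along these lines (a sketch only; the pin and buffer names are placeholders, and note the timer starts at ISR entry, so interrupt latency still adds to the total):

Code:
const int DATA_PIN = 2;                  // placeholder pin
volatile uint8_t bits[200];
volatile uint16_t bitCount = 0;

void dataFalling() {
  elapsedMicros sinceEdge;               // starts counting when the ISR begins
  while (sinceEdge < 2) {
    // spin; if another interrupt runs here, the wait still ends on time
  }
  if (bitCount < sizeof(bits)) {
    bits[bitCount++] = digitalReadFast(DATA_PIN);
  }
}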

Or you could just turn off interrupts during your delay. And that may be a good thing to try anyway as a way to see if system interrupts are causing the issue.
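For example, the same ISR with interrupts masked around the wait (delayMicroseconds is a busy loop, so it doesn't need interrupts to run):

Code:
void dataFalling() {
  noInterrupts();                        // keep SysTick / USB etc. from stretching the wait
  delayMicroseconds(2);
  uint8_t bit = digitalReadFast(DATA_PIN);
  interrupts();
  // store bit in your buffer here
}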
 
IMHO, interrupts are not the optimal way to do this. Why not use an FTM and capture both signal edges, rising and falling? That way, without any latency, you get a reading of about 48 for a 1 us high or low time, and around 144 for a 3 us high or low time, which makes it easy to distinguish your 1 and 0 sequences, even when the input timing is somewhat off for whatever reason. Look at the source code of the FreqMeasureMulti library for implementation details.
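Very roughly like this; the pin number is a placeholder and the space-only mode constant is from memory, so check FreqMeasureMulti.h and its examples before relying on it:

Code:
#include <FreqMeasureMulti.h>

FreqMeasureMulti gcLine;

void setup() {
  Serial.begin(115200);
  gcLine.begin(5, FREQMEASUREMULTI_SPACE_ONLY);   // measure the low time of each bit
}

void loop() {
  if (gcLine.available()) {
    uint32_t count = gcLine.read();    // FTM counts for the low portion
    // roughly 48 counts for a 1 us low (a "1"), roughly 144 for a 3 us low (a "0")
    uint8_t bit = (count < 96) ? 1 : 0;
    Serial.print(bit);
  }
}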
 