goomysmash
New member
I'm working on a project that reads the serial signal coming from a GameCube controller using a Teensy 3.2 (datasheet)
The attached image shows how the GameCube serial signal writes data and how I want to read it with my Teensy 3.2. Here is the wiring diagram. I'm using the Arduino IDE to upload the code.
Basically, the serial signal goes low for 1 microsecond then high for 3 microseconds to write a 1, and low for 3 microseconds then high for 1 microsecond to write a 0. The way I read it is with an interrupt function triggered on the falling edge of the data line (which happens at the start of every bit regardless of its value). The interrupt waits 2 microseconds so it lands in the middle of the bit, then stores a 1 if the line is high or a 0 if it's low. That's what the "ideal behavior" in the diagram shows: my Teensy 3.2 working correctly, which it does most of the time.
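Just to make the idea concrete, here's a minimal sketch of that sampling approach. The pin number and buffer size are placeholders, not my actual wiring:

```cpp
// Minimal sketch of the falling-edge / mid-bit sampling idea described above.
// DATA_PIN and the buffer size are assumptions, not my real setup.
const uint8_t DATA_PIN = 12;           // assumption: data line on pin 12
volatile uint8_t bits[128];            // raw bits captured by the ISR
volatile uint16_t bitCount = 0;

void onFallingEdge() {
  delayMicroseconds(2);                // move to the middle of the bit cell
  // high at mid-bit = 1 (1 us low / 3 us high), low = 0 (3 us low / 1 us high)
  if (bitCount < sizeof(bits)) {
    bits[bitCount++] = digitalRead(DATA_PIN);
  }
}

void setup() {
  pinMode(DATA_PIN, INPUT);            // the GameCube data line has its own pull-up
  attachInterrupt(digitalPinToInterrupt(DATA_PIN), onFallingEdge, FALLING);
}

void loop() {
  // decode bits[] here once a full frame has been captured
}
```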
However, sometimes the interrupt starts with a random delay: it fires about 1-2 microseconds later than it should, so it samples at the wrong time and gets the bit wrong.
Here is a copy of my code
One other thing: I've been trying two different microcontrollers for this, the Arduino Nano and the Teensy 3.2. I started with the Arduino Nano, which runs at 16 MHz. On the Nano, the pin reads and writes take so many clock cycles that I don't even need the line delayMicroseconds(2); the overhead alone is about 2 microseconds, so it happens to line up fairly well with the timing I want without any explicit delay. It still has the issue of the interrupt randomly starting too late, though.
When I try the same thing on the Teensy 3.2 running at 96 MHz I do need the delayMicroseconds(2); line. But I still hit the same problem: the interrupt randomly takes a few microseconds longer than it should, samples too late, and gets the bit wrong.
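If it helps anyone reproduce this, one way to see the jitter is to raise a spare debug pin as the very first thing in the ISR and compare its rising edge against the data line's falling edge on a scope; the spread between the two edges is the interrupt entry latency. A rough sketch, with pin 11 purely as a placeholder (digitalWriteFast() is the Teensyduino fast write; on the Nano plain digitalWrite() works but adds its own overhead):

```cpp
// Rough jitter-measurement sketch: raise a debug pin at ISR entry, drop it
// at ISR exit, and compare its rising edge to the data line on a scope.
const uint8_t DEBUG_PIN = 11;          // placeholder: any spare pin

void onFallingEdge() {
  digitalWriteFast(DEBUG_PIN, HIGH);   // first instruction in the ISR
  delayMicroseconds(2);
  // ... sample the data line here ...
  digitalWriteFast(DEBUG_PIN, LOW);
}

void setup() {
  pinMode(DEBUG_PIN, OUTPUT);
  // attachInterrupt(...) as in the earlier sketch
}

void loop() { }
```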
Some solutions I've already tried:
- Using NOPs in assembly to delay 2 microseconds instead of delayMicroseconds(2); I get the same results (see the sketch after this list).
- Delaying by only 1 microsecond so there's a 3 microsecond window to get the read timing right; the interrupt timing still varies so much that I sometimes get the bit wrong, on both the Teensy and the Nano.
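For reference, here's roughly what I mean by the NOP delay in the first point. The cycle budgets are back-of-envelope estimates (16 cycles/us at 16 MHz on the Nano, 96 cycles/us at 96 MHz on the Teensy), and on the Teensy I'm assuming the DWT cycle counter is usable since the Teensyduino core enables it at startup; either way it would need calibrating on a scope:

```cpp
// Busy-wait delay without delayMicroseconds(). Cycle counts are estimates;
// loop overhead and flash wait states mean this needs scope calibration.
#if defined(__MK20DX256__)             // Teensy 3.2 (Cortex-M4)
// Spin on the DWT cycle counter instead of counting NOPs by hand.
static inline void delay2us() {
  uint32_t start = ARM_DWT_CYCCNT;
  while (ARM_DWT_CYCCNT - start < 2 * (F_CPU / 1000000)) { }
}
#else                                   // 16 MHz AVR Nano
static inline void delay2us() {
  __builtin_avr_delay_cycles(32);       // avr-gcc builtin: exactly 32 cycles = 2 us
}
#endif
```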
I suspect the problem is either inside the interrupt itself or some delay that makes the interrupt fire later than it should, but I don't know. I asked on another forum and was told to look into bit-banding so I can reduce the number of instructions for reads and writes. While I'm looking into that, does anyone have other suggestions to try?
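From what I've read so far, bit-banding on the Teensy 3.2's Cortex-M4 maps each bit of the peripheral region at 0x40000000 to its own 32-bit word at 0x42000000, so a single load or store touches exactly one GPIO bit. A sketch of what I think that looks like, using pin 13 (Port C bit 5 on the Teensy 3.2) purely as an example:

```cpp
// Hedged bit-banding sketch for the Teensy 3.2 (MK20DX256, Cortex-M4).
// Each bit of a peripheral register at 0x40000000+off has a 32-bit alias
// at 0x42000000 + off*32 + bit*4, so one GPIO bit is one memory access.
#define BITBAND_PERI(reg, bit) \
  (*(volatile uint32_t *)(0x42000000u + \
     (((uint32_t)&(reg)) - 0x40000000u) * 32u + (bit) * 4u))

// Pin 13 on the Teensy 3.2 is Port C bit 5; substitute your pin's port/bit.
#define DATA_IN   BITBAND_PERI(GPIOC_PDIR, 5)   // read one input bit
#define DATA_OUT  BITBAND_PERI(GPIOC_PDOR, 5)   // write one output bit

// usage in the ISR:  uint8_t bit = DATA_IN;    // single load, gives 0 or 1
```

Teensyduino's digitalReadFast()/digitalWriteFast() apparently compile down to single register accesses too when the pin number is a compile-time constant, so they might get the same effect with less fuss.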
If you can help I'd be eternally grateful! Thanks for taking the time to read this.