I'm working on a controller for a GameCube game. It uses a Teensy++ 2.0 to act as the controller.
When I started experimenting with making macros for the game (chains of inputs spaced precisely in time), I found that the elapsedMillis and elapsedMicros timers seem to be very loose with their timing. The more complicated the code the board is running, the more jittery the timing seems to get.
What's strange to me is that the timing jitter is on the order of one frame at 60 fps, so around 16 ms. What's even stranger is that inputs sometimes show up a frame before they're supposed to.
I even did really simple tests with a padhack of a GameCube controller: I patched a Teensy 3.2 into two different buttons on the controller and simply had the Teensy complete the circuit for those buttons at specific intervals. I got the same results. Sometimes the input would be frame-perfect, sometimes it would be a frame early, and sometimes it would be a frame late.
I thought it might be a problem with the way the game polls the input, but I had the same results with two different games. I also printed timings to serial while testing in game, and according to the serial output, the timing is almost perfect.
Does anyone know why this might be? Is there a way to get around it? I would like the inputs to show up on the exact frame I want. I don't think I can use other time-critical techniques to achieve that, however, because I'm already using a time-critical library to communicate with the GameCube.