This follows on from a previous thread regarding serial data corruption over USB, likely caused by the 96MHz overclock on a Teensy3.0.
The project uses a Teensy3.0 to decode the read-data stream from a double density disk in an Atari ST disk drive.
I replaced the 3.0 with a 3.2 board. I copied the sketch in case changes were needed; however, the code compiled without any changes. Unfortunately, reading data from floppy disk sectors has become very unreliable, with only about 1 read in 10 returning the correct data. The same code on the 3.0 was rock solid.
My first guess was that something was different with respect to the FTM timer's source clock. When reading the read-data pulses from the floppy, I simply need to distinguish between long, medium and short pulses. I have some test code which prints a crude histogram of the measured counts; using this I was able to see where to set the limit values. On the 3.0 the histogram was surprisingly repeatable. On the 3.2 it looks different, but I think it is more or less the same shape.
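For reference, the histogram test is along these lines (a simplified sketch, not the exact test code; the bin count and bar scaling are illustrative, and "timer" is the captured count from the ISR shown further down):
C:
// crude pulse-width histogram - simplified sketch, not the exact test code
volatile uint16_t bins[128];              // one bin per possible 7-bit count value

// in the edge ISR, instead of decoding, just bin the captured count:
//   bins[timer & 0x7F]++;

void print_histogram(void) {
  for (int i = 0; i < 128; i++) {
    if (bins[i] == 0) continue;
    Serial.print(i);
    Serial.print(": ");
    uint16_t n = (bins[i] > 60) ? 60 : bins[i]; // cap the bar length
    while (n--) Serial.print('*');
    Serial.println();
  }
}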
I was concerned that the clock used to drive the FTM might be different on the 3.2 compared to the 3.0. I am no expert on the hardware of the Teensy CPU. I think I selected the "system clock" to clock the FTM, but TBH I don't think I really knew its frequency. I prescaled it by dividing by 4, which gave reasonable numbers in the range 0 to 127 when reading back the count. Not the most professional approach, in that I didn't rigorously understand how the FTM worked or was clocked.
An FTM count value of 70 is about average for a 6us edge-to-edge data bit. I guess that makes the "system clock" on the 3.0 48MHz (NOTE: I am using a 96MHz overclock), which I think makes sense (48MHz / 4 prescale = 12 counts/us expected; 70 / 6us = 11.7 counts/us measured).
So on the 3.2 what is the frequency of the system clock (with 96MHz overclock)?
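One way I can check at runtime (assuming the Teensyduino F_CPU/F_BUS macros reflect the configured clocks, and assuming the FTM's "system clock" option is actually the bus clock, as I believe it is on these Kinetis parts):
C:
Serial.print("F_CPU = "); Serial.println(F_CPU); // core clock
Serial.print("F_BUS = "); Serial.println(F_BUS); // bus clock - feeds the FTM, I believe
// expected count for a 6us gap with the /4 prescaler:
Serial.println(((F_BUS / 4) / 1000000) * 6);     // 48MHz -> 12 counts/us -> 72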
I can also display the raw data bytes. An interesting observation on the 3.2 is that the header is always read correctly and its CRC is always correct; this is only a short burst of pulses, the header being 4 bytes long. When a sector reads incorrectly it is obvious not only from the ASCII hex bytes dumped to the serial terminal, but also because the CRC is wrong.
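For reference, the CRC in question is the standard FDC CRC-16/CCITT (polynomial 0x1021, preset 0xFFFF, with the A1 sync bytes included in the calculation); the per-byte update is along these lines (a sketch, my actual code may differ in detail):
C:
// standard FDC CRC-16/CCITT update - sketch, details may differ from my code
static uint16_t crc16_update(uint16_t crc, uint8_t data) {
  crc ^= (uint16_t)data << 8;
  for (int i = 0; i < 8; i++)
    crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x1021) : (uint16_t)(crc << 1);
  return crc;
}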
One guess is that I am getting an "unexpected" interrupt during the sector read, which changes the measured value for one pulse or even loses a pulse entirely. That would corrupt the data stream from that point onwards. I wonder if this is what I am seeing? Any rogue pulse value causes the decode state machine to go wrong, and it can't re-sync until the next header synchronisation "missing clock".
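One way I could test this hypothesis (illustrative names, not existing code): log any out-of-range delta together with the bit position where it occurred, then dump the log after the read. A single rogue entry right where the bytes start to go wrong would support the theory:
C:
// illustrative debug aid - record out-of-range deltas and their positions
#define BADLOG_SIZE 16
volatile uint8_t  badlog_val[BADLOG_SIZE]; // the rogue count value
volatile uint32_t badlog_pos[BADLOG_SIZE]; // bit position where it occurred
volatile uint8_t  badlog_cnt = 0;

// in the ISR, in the too-short branch (and a too-long catch-all):
//   if (badlog_cnt < BADLOG_SIZE) {
//     badlog_val[badlog_cnt] = timer;
//     badlog_pos[badlog_cnt] = bitcnt;
//     badlog_cnt++;
//   }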
Note: while reading a sector I switch off the "systick" interrupt, which would otherwise cause the same problem.
On the 3.2 are there any new sources of interrupt which might be happening, and which would impact the FTM timer values?
It could possibly be a hardware problem causing a glitch. I'm just searching for a possible software reason first.
Relevant code:
From void setup()
C:
// set up timer config which doesn't change
FTM0_MODE = FTM_MODE_FTMEN | FTM_MODE_WPDIS; // enable FTM, disable write protection
FTM0_MOD = 0xFFFF;                           // free-running over the full 16-bit range
// disable to begin with - done by setting the clock source to "none"
FTM0_SC = FTM_SC_CLKS(0) | (PRESCALE);       // clock disabled + divide-by-4 prescale
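For completeness, a divide-by-4 prescale corresponds to FTM_SC_PS(2) from kinetis.h (the field encodes the divider as a power of two), so PRESCALE would be defined along these lines:
C:
#define PRESCALE FTM_SC_PS(2) // assumed definition: divide-by-4 (2^2)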
Interrupt routine called on every falling edge of the floppy read data (truncated)
C:
FASTRUN void FDD_stream_decode_ISR(void) {
  uint8_t timer;
  uint32_t isfr = PORTD_ISFR;
  PORTD_ISFR = isfr;                       // clear the port interrupt flag
  digitalWriteFast(debugpin, HIGH);
  FTM0_SC = FTM_SC_CLKS(0) | (PRESCALE);   // STOP TIMER
  timer = (uint8_t)(FTM0_CNT & 0xff);      // READ COUNT
  FTM0_CNT = 0;                            // RESET COUNT
  FTM0_SC = FTM_SC_CLKS(1) | (PRESCALE);   // RESTART TIMER
  if (readmode == DECODE_EVENTS) {
    if (TOOSHORT_DELTA) { badbitcnt++; }   // bad pulse - too short!
    else if (SHORT_DELTA_4US) {            // another same bit 0..0 or 1..1
      if (lastbit == 0) { bitstream = (bitstream << 1) | 0; syncbits = (syncbits << 1) | 0; bitcnt++; }
      else              { bitstream = (bitstream << 1) | 1; syncbits = (syncbits << 1) | 0; bitcnt++; }
    }
    else if (MEDIUM_DELTA_6US) {           // invert 0...1 or 1...0
      if (lastbit == 0) { bitstream = (bitstream << 1) | 1; syncbits = (syncbits << 1) | 0; lastbit = 1; bitcnt++; }
      else              { bitstream = (bitstream << 2) | 0; syncbits = (syncbits << 2) | 0; lastbit = 0; bitcnt += 2; }
    }
    else {                                 // 8US - adds 01 0..1..0 (special sync only) or 1..0..1
      if (lastbit == 1) { bitstream = (bitstream << 2) | 1; syncbits = (syncbits << 2) | 0; bitcnt += 2; }
      else              { bitstream = (bitstream << 2) | 0; syncbits = (syncbits << 2) | 3; bitcnt += 2; }
    }
Routine for reading data, which switches off SysTick interrupts and enables interrupts for the index pulse and data stream...
C:
index_detected = 0;                      // reset flag - can only be set in the index interrupt routine
while (index_detected == 0);             // wait for index pulse
FTM0_SC = FTM_SC_CLKS(1) | (PRESCALE);   // system (48MHz) clock + divide-by-4 prescale
FTM0_CNT = 0;
// disable the SysTick interrupt by clearing the interrupt enable bit
SYST_CSR = SYST_CSR_CLKSOURCE;           // disable SYSTICK interrupts (also stops the counter, so millis() pauses)
clearpending_FDD_stream_decode_ISR();
enable_FDD_stream_decode_ISR();
index_detected = 0;                      // reset flag again
while (index_detected == 0);             // wait for the next index pulse (one full revolution)
disable_FDD_stream_decode_ISR();
clearpending_FDD_stream_decode_ISR();
FTM0_SC = FTM_SC_CLKS(0) | (PRESCALE);   // disable clock
// re-enable the SysTick interrupt by setting the interrupt enable bit back to one
SYST_CSR = SYST_CSR_CLKSOURCE | SYST_CSR_TICKINT | SYST_CSR_ENABLE; // re-enable SYSTICK
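One experiment I have not tried yet: USB is the other interrupt source I know is active during the read, so I could mask it around the sector read as well (the NVIC macros and IRQ name are from Teensyduino's kinetis.h):
C:
// untested idea: also mask the USB interrupt around the sector read
NVIC_DISABLE_IRQ(IRQ_USBOTG);  // USB OTG interrupt on the Kinetis K20
// ... sector read as above ...
NVIC_ENABLE_IRQ(IRQ_USBOTG);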
Just noticed my comment says that the system clock is 48MHz.