Teensy 3.6 Parallel Interface with 16 bit ADS8556 ADC 630kSPS

taterbot

Hi all,

I'm using a Teensy 3.6 to read from an ADS8556 ADC over its 16-bit parallel interface. I'm having trouble getting the timing tight enough to read at the max data rate of 630 kSPS; I'd be happy to get to around 600 kSPS. The part supports simultaneous sampling across its 6 channels, and I only need 5 data channels.

I'm overclocking with F_CPU at 240 MHz. The first place where I don't seem to be getting good performance is pulling CS' low on the falling edge of the BUSY signal coming out of the ADC; the falling edge of BUSY indicates that the conversion is done and ready for readout. I've tried using attachInterrupt() on BUSY going into a digital input on the Teensy, but about 300 ns elapse between the falling edge of BUSY and the digitalWriteFast() to CS'. My best attempt as of now is a while loop, polling BUSY for the falling edge. The code below results in a ~115 ns delay between the falling edge of BUSY and the falling edge (digitalWriteFast) of CS'.
BUSY is bit 11 of PORTB.
Code:
digitalWriteFast(CONVST, HIGH);
delayNanoseconds(300);
while (GPIOB_PDIR & (1 << 11)) {}  // spin until BUSY (bit 11 of PORTB) falls
CS_Low();                          // then assert CS' to start readout

FASTRUN void CS_Low() {
    digitalWriteFast(CS_, LOW);
}
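
For comparison, part of the attachInterrupt() latency is its per-pin dispatch, so installing the PORTB interrupt vector directly should shave some of that off. A rough, untested sketch, assuming BUSY is on Teensy pin 32 (PTB11, matching the bit-11 mask above):
Code:
void busyIsr() {
    digitalWriteFast(CS_, LOW);  // assert CS' as early as possible
    PORTB_ISFR = (1 << 11);      // then clear PTB11's interrupt flag
}

void setupBusyInterrupt() {
    pinMode(32, INPUT);
    CORE_PIN32_CONFIG |= PORT_PCR_IRQC(0xA);  // interrupt on falling edge
    attachInterruptVector(IRQ_PORTB, busyIsr);
    NVIC_SET_PRIORITY(IRQ_PORTB, 0);          // highest priority
    NVIC_ENABLE_IRQ(IRQ_PORTB);
}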

The second, and perhaps more consequential, under-optimized part is reading over the parallel interface. I have each of the 16 data bits connected to a digital input on the Teensy. The ADS8556 has an RD' input pin that needs a falling edge to trigger the next ADC channel's data to be presented on the data bit pins.
Code:
for (int i = 0; i < NUM_CHAN*2; i += 2) {
    digitalWriteFast(RD_, LOW);    // falling edge presents the next channel
    delayNanoseconds(32);          // wait for the data bits to be valid
    DataFrame[i]   = GPIOA_PDIR;   // grab the raw port words; the 16 data
    DataFrame[i+1] = GPIOC_PDIR;   // bits are split across PORTA and PORTC
    digitalWriteFast(RD_, HIGH);
    delayNanoseconds(10);          // RD' high time before the next pulse
}
Each iteration of the loop above runs in 110 ns. Will I need to use DMA to speed this up? I would then also need to synchronize the toggles of RD' that step through each ADC channel. Would a possible approach be to use DMA reads of the port registers triggered by a PDB, use an IntervalTimer to create the pulse train on RD', and use a delay of ~32 ns to offset the phase of the DMA reads from the RD' toggles? This seems pretty hacky, but I'm not sure how else to speed this up.

Is there a way to tie the DMA trigger to the RD' pin?
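
If the PORT module's pin-edge DMA request (IRQC = 2 for falling edge) can be routed through DMAMUX_SOURCE_PORTx to a DMAChannel, it might look something like this rough, untested sketch (assuming RD' is looped back externally to a spare input on pin 2 / PTD0, and capturing only PORTA; a second channel set up the same way would handle PORTC):
Code:
#include <DMAChannel.h>

DMAChannel dmaA;
static volatile uint32_t rawA[5];  // one raw PORTA word per channel

void setupCapture() {
    pinMode(2, INPUT);
    CORE_PIN2_CONFIG |= PORT_PCR_IRQC(2);        // DMA request on falling edge
    dmaA.begin();
    dmaA.source(GPIOA_PDIR);                     // one 32-bit port read per edge
    dmaA.destinationBuffer(rawA, sizeof(rawA));  // wraps after five transfers
    dmaA.triggerAtHardwareEvent(DMAMUX_SOURCE_PORTD);
    dmaA.enable();
}
The request-to-transfer latency through the DMAMUX and crossbar might even stand in for the ~32 ns settling delay; that would be worth checking on a scope.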

Thanks in advance!
 
Consider using a Teensy 4.1: faster CPU, and it can read 16 contiguous bits in a single port read. I use a different ADC with different handshaking, but run a single channel at 12 MSPS.

The ADS8558 will convert a little faster. Can you leave CS' low all the time? The 32 ns and 10 ns delays can maybe go a little lower (since there are some additional delays in the loop). Tune with an oscilloscope.
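
To illustrate the single-read point, an untested sketch (pin list from memory, so check it against the 4.1 pinout card): wiring the 16 data lines to pins 19, 18, 14, 15, 40, 41, 17, 16, 22, 23, 20, 21, 38, 39, 26, 27 puts them on bits 16-31 of GPIO6, and the whole bus comes back in one register read.
Code:
static inline uint16_t readBus16() {
    return (uint16_t)(GPIO6_PSR >> 16);  // pad states of GPIO6 bits 16..31
}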
 
The reason I'm using Teensy 3.6 is that I need a DAC, which the 4.1 doesn't have.

From what I can tell in the ADS8556 datasheet, CS' needs to be high while the conversion happens. Maybe I'm misinterpreting the timing diagram?

Yeah, the 32 and 10 ns may be tuned lower, though in a different test where I wasn't optimizing for speed, I found that anything below 32 ns gave me junk data bits.
 