Synchronizing DAC and ADC for simultaneous use - Teensy 3.2/3.6

Status
Not open for further replies.

Four_FUN

Member
Hello everybody. Just as a heads-up: I won't be able to post much of my code due to the nature of my work, but I will try to post relevant pieces as much as possible.

Currently I have set up the PDB to initiate DMA transfers from a look-up table to the DAC output at some fixed frequency (4096 or 2048 samples in 5 ms, so 1 sample per 1.22 us or 2.44 us). What I would like to do is use every Nth of these transfers, with N large enough that the ADC has had time to sample and convert the data, to initiate sampling on the ADC, and, when the result is ready, to initiate a DMA transfer of it into a buffer (eventually ping-pong buffers, but that is a post for a later date).
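For reference, here is a stripped-down sketch of the kind of setup I mean. It is not my actual code: register names are from kinetis.h, the structure follows the Audio library's output_dac.cpp, and the period value is only illustrative.
Code:
#include <DMAChannel.h>

#define RAMP_SAMPLES 4096
static uint16_t rampLUT[RAMP_SAMPLES];        // 12-bit ramp values, 0..4095
static DMAChannel dacDMA;

void setupDacRamp(void) {
  SIM_SCGC2 |= SIM_SCGC2_DAC0;                // clock the DAC module
  DAC0_C0 = DAC_C0_DACEN | DAC_C0_DACRFS;     // enable DAC, 3.3 V reference

  // One 16-bit sample from the LUT to DAC0_DAT0L per DMA request
  dacDMA.sourceBuffer(rampLUT, sizeof(rampLUT));
  dacDMA.destination(*(volatile uint16_t *)&DAC0_DAT0L);
  dacDMA.triggerAtHardwareEvent(DMAMUX_SOURCE_PDB);
  dacDMA.enable();

  // PDB in continuous mode, software trigger source, one DMA request per PDB period.
  // MOD is roughly F_BUS * 5 ms / 4096 - 1, i.e. about 58 at a 48 MHz bus clock.
  SIM_SCGC6 |= SIM_SCGC6_PDB;
  PDB0_IDLY = 1;
  PDB0_MOD = 58;
  PDB0_SC = PDB_SC_TRGSEL(15) | PDB_SC_PDBEN | PDB_SC_CONT
          | PDB_SC_PDBIE | PDB_SC_DMAEN | PDB_SC_LDOK;
  PDB0_SC |= PDB_SC_SWTRIG;                   // start counting
}
This just free-runs the ramp; per-pulse start/stop and the ADC side are the parts I am asking about.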

This is for a radar application, and the ADC samples need to be very well aligned with the start and end of the DAC output waveform (let's call it a ramp for simplicity). What this means is that if I output 2048 samples to the DAC and read an ADC value every 4 samples, I would have 512 ADC samples in one ramp pulse that I know are well aligned to that pulse. Hence the transfer to the DAC and the initiation of ADC sampling would have to happen concurrently (or close to it) on every fourth DAC sample. The problem is that if I use the PDB to time the DMA transfers to the DAC, I don't know how to use some multiple of that period to time the ADC samples. Using the same timing would be way too fast for the ADC.

Therefore I turn to you for help in how to achieve this goal. I am not tied to the PDB, by the way, but I would like to stick to DMA transfers if possible.

Thank you very much in advance!

P.S.: I have achieved this using IntervalTimer (PIT) interrupts to time everything correctly. Unfortunately there is some jitter when I try to write the ADC samples to the PC via USB, because I imagine the USB stack suspends interrupts in order to transfer data, which messes up my timing. This is an issue that I may make a separate post about in the future, but I am not sure there is a solution.
 
Suspect the DMA might get you the synchronised pipe you are after. The question, of course, is digging far enough into the reference manual to find the setup for it. Looking at the Audio library and https://github.com/pedvide/ADC may give you some examples of how you can leverage the internal hardware to do the heavy lifting for you. There are costs to balance here between the time to dig down, find the registers and get them set right, and spending the big bucks jumping straight to an FPGA intended for this purpose.

Though my limited exposure to the results of using the radar FPGAs suggests there is plenty of interesting incorrect documentation there as well.
 
Thank you for your reply.
Yeah, an FPGA is where this will most likely end up eventually, but we needed a quick prototype, so something like the Teensy was a good first choice. Of course, as all projects go, the Teensy is getting more and more ingrained, at least for another year (out of my control).

I have taken a look at the Audio library. From what I can tell, the PDB_PERIOD there is fixed and the same (44.1 kHz "CD quality") for both the DAC and ADC. What I am looking to do is use some set of timers that can interface with the DAC and ADC and trigger DMA transfers somewhat synchronously. I guess what I am saying is I wish there were two different PDBs I could set two different moduli on. Perhaps you know: could I use another set of timers, one as a source to the DMAMUX and one as a hardware trigger to the ADC?

I have also looked at the excellent ADC library but I generally use it only to set up the ADC. While the defensive coding used is great in general, at the speeds I am going I just stripped it down to the most basic register operations.
 
I was more thinking that the registers referenced in those libraries might offer a shortcut to the reference manual pages of interest, given that starting on page one isn't a great way to go with that sucker. The short-form data sheet for the 3.6 core is
https://www.pjrc.com/teensy/K66P144M180SF5V2.pdf but my google-fu is weak this morning and I'm not finding the PJRC copy of the full several-thousand-page reference manual. I believe you can use other timers for the DMA process, but that's based on reading forum posts rather than the manual itself.
 
Without checking feasibility or how this could be implemented, I would look for the following:

A two-layer DMA with a 4-sample inner loop and a 512-count outer loop, for a total of 2048 samples

The end of the inner loop, every 4 samples, triggers the ADC
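Something like this untested sketch is how I read that in eDMA/TCD terms (Teensy DMAChannel library, macro names from kinetis.h):
Code:
#include <DMAChannel.h>

static uint16_t rampLUT[2048];                 // the DAC look-up table
static DMAChannel dacDMA;

void setupNestedDacDMA(void) {
  dacDMA.TCD->SADDR = rampLUT;
  dacDMA.TCD->SOFF = 2;                        // advance 2 bytes per sample read
  dacDMA.TCD->ATTR = DMA_TCD_ATTR_SSIZE(1) | DMA_TCD_ATTR_DSIZE(1); // 16-bit transfers
  dacDMA.TCD->NBYTES_MLNO = 4 * 2;             // minor (inner) loop: 4 samples = 8 bytes
  dacDMA.TCD->SLAST = -(2048 * 2);             // rewind the LUT when the major loop ends
  dacDMA.TCD->DADDR = &DAC0_DAT0L;
  dacDMA.TCD->DOFF = 0;                        // every sample lands on the same data register
  dacDMA.TCD->DLASTSGA = 0;
  dacDMA.TCD->CITER_ELINKNO = 512;             // major (outer) loop: 512 minor loops
  dacDMA.TCD->BITER_ELINKNO = 512;
  dacDMA.TCD->CSR = 0;
  dacDMA.triggerAtHardwareEvent(DMAMUX_SOURCE_PDB);
  dacDMA.enable();
  // "Every 4 samples" corresponds to a minor-loop completion here. As far as I can
  // tell, the eDMA can link a minor-loop completion to another DMA channel (the
  // ELINK/LINKCH fields of CITER/BITER) but cannot start an ADC conversion by itself.
}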
 
The ADC and DAC have limits to how fast they can work.

The ADC runs from a clock, which you configure by dividing down the bus clock. It takes a certain number of those clock cycles to complete a conversion, and that number also depends on the specific settings.
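For illustration, the registers involved look roughly like this (untested values, macro names from kinetis.h; check the divider against the data sheet limits for the resolution you use):
Code:
void setupAdcClock(void) {
  // ADCK = bus clock / 4 (e.g. 48 MHz / 4 = 12 MHz on a Teensy 3.2 at 96 MHz)
  ADC0_CFG1 = ADC_CFG1_ADIV(2)      // divide the input clock by 4
            | ADC_CFG1_ADICLK(0)    // bus clock as the input clock
            | ADC_CFG1_MODE(1)      // 12-bit single-ended
            | ADC_CFG1_ADLSMP;      // long sample time (slower but more accurate)
  ADC0_CFG2 = ADC_CFG2_ADHSC;       // high-speed conversion sequence
  // The total conversion time is then a fixed number of ADCK cycles that depends on
  // resolution, sample time and averaging; see the ADC timing section of the manual.
}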

For the DAC, you can write to its registers at incredible speed, but its hardware only has a fixed analog bandwidth. It's much faster than the ~30 kHz from Freescale's spec, even when driving a capacitive load. Of course, it's fastest when you use a buffer and minimize capacitance. But you shouldn't expect 1-2 MHz analog bandwidth from its circuitry.
 
The fastest the ADC can work on a Teensy 3.2 at 96 MHz is about 1 us per conversion. If you use the T3.6 at a higher clock rate you might get a faster rate; however, the accuracy of the results won't be guaranteed.
 
@mlu Unless I misunderstand the way DMA transfers work, what you suggest is very nearly there, but not quite. I thought about doing something similar, but every minor loop has to be 2 bytes to the DAC data register. If the inner loop is 4 samples long (2 bytes/sample * 4 samples = 8 bytes), then you will be overwriting the DAC data register as each of the 4 (or 8) transfers within the minor loop happens. Ideally what I would like is an interrupt or flag that I could use to trigger an ADC sample start every 4 DMA transfers to the DAC.

@Pedvide First of all, thank you for the ADC library. It was an excellent resource, and I want to take this opportunity to say again that it's very, very, very well written. For what I was doing, though, I needed a fine level of control (again, apologies that I can't share much in the way of details), so I essentially went the direct register manipulation way. I don't think I will be sampling the ADC that fast in any use case, but it's good to know I can, even if I don't know what I would be doing with the data.

@PaulStoffregen Let me also take this chance to thank you, Paul, for the fantastic work on the Teensy. One of the better microcontrollers out there by any measure. I am quite aware that the Freescale spec is somewhat conservative. There was another post you answered that convinced me it was much faster (I believe it's included in the tips and tricks post). If I recall from your scope shots, the DAC was changing levels in about 1 us. In fact I have output a 5 ms ramp composed of the full 4096 levels on the DAC, which means I change the output level every 1.22 us. On the scope the ramp looked fine (quite nearly perfect, actually), and when using it to drive a VCO I was able to get the full range of frequencies out on a spectrum analyzer.
 
I think there is some confusion as to what I am asking. I know how to configure, and have configured, the DAC and the ADC to work separately and together. Let me try to restate my problem. It is not the actual timings that are important so much as the scheme of doing things.

I would like to configure the system to initiate DMA transfers from a look-up table (LUT) to the DAC at a defined and fixed rate F. At some fixed fraction of this rate, e.g. F/4 (equivalently, every fixed number of DMA transfers, here 4, or every fixed time interval 4/F), I would like to command the start of a conversion on the ADC. Once that conversion is complete (COCO flag on the ADC), I would like to initiate a DMA transfer of the result to a large buffer which will be used as a ping-pong buffer (top half filled while the bottom half is read out, bottom half filled while the top half is read out). The ADC samples need to be very well aligned with the start and end of the waveform output on the DAC, so that on the PC side I know exactly how many samples correspond to one pulse.
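To make the ADC side of that data path concrete, here is a rough, untested sketch (register names from kinetis.h; the channel number and buffer size are purely illustrative):
Code:
#include <DMAChannel.h>

#define ADC_BUF_LEN 1024                       // illustrative: two halves of 512 samples
uint16_t adcBuffer[ADC_BUF_LEN];               // the ping-pong buffer
DMAChannel adcDMA;

void setupAdcDMA(void) {
  // ADC: hardware-triggered conversions, DMA request when COCO sets
  ADC0_SC1A = 5;                               // input channel select (5 is a placeholder)
  ADC0_SC2 = ADC_SC2_ADTRG | ADC_SC2_DMAEN;    // hardware trigger + DMA on conversion complete
  // (the hardware trigger source itself is selected via SIM_SOPT7; the default is the PDB)

  // DMA: copy each result from ADC0_RA into the buffer; reading RA also clears COCO
  adcDMA.source(*(volatile uint16_t *)&ADC0_RA);
  adcDMA.destinationBuffer(adcBuffer, sizeof(adcBuffer));
  adcDMA.triggerAtHardwareEvent(DMAMUX_SOURCE_ADC0);
  adcDMA.enable();
}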

What I know how to do:
  • Set up the peripherals
  • Properly configure the DMA channels
  • Use a completely separate timer to time the ADC sampling (e.g. FTM or PIT)
  • Set up the ping-pong buffer using the DMA_TCDn_CSR[INTHALF] and DMA_TCDn_CSR[INTMAJOR] flags
What I don't know how to do:
  • Tie the ADC sample rate to the DAC so that a conversion is initiated every defined number of DMA transfers to the DAC
  • Actually attach interrupt routines to handle the above ping-pong interrupts. Would it be dma_chn_inthalf_isr()? (see the sketch just after this list)
  • Perhaps a whole different way that can achieve what I want. Perhaps a different set of timers or scheme. Open to ideas.
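On the interrupt question: as far as I can tell there is a single interrupt vector per DMA channel (dma_ch0_isr, dma_ch1_isr, ... in the Teensy core), and both the INTHALF and INTMAJOR conditions fire that same vector, so the handler has to work out which half just completed. With the DMAChannel library, continuing the ADC sketch above, it might look like this (untested):
Code:
#include <DMAChannel.h>

#define ADC_BUF_LEN 1024
extern uint16_t adcBuffer[ADC_BUF_LEN];        // from the sketch above
extern DMAChannel adcDMA;
static volatile bool lowerHalfReady = false, upperHalfReady = false;

static void adcDMA_isr(void) {
  // Same trick as the Audio library's input_adc.cpp: look at where the destination
  // pointer currently is to tell which half of the buffer was just finished.
  uint32_t daddr = (uint32_t)(adcDMA.TCD->DADDR);
  adcDMA.clearInterrupt();
  if (daddr < (uint32_t)&adcBuffer[ADC_BUF_LEN / 2]) {
    upperHalfReady = true;                     // DMA is now filling the lower half
  } else {
    lowerHalfReady = true;                     // DMA is now filling the upper half
  }
}

void attachPingPongInterrupts(void) {
  adcDMA.interruptAtHalf();                    // sets DMA_TCDn_CSR[INTHALF]
  adcDMA.interruptAtCompletion();              // sets DMA_TCDn_CSR[INTMAJOR]
  adcDMA.attachInterrupt(adcDMA_isr);          // enables the channel's IRQ in the NVIC
}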

Hopefully this makes what I am trying to do clearer. As mentioned in the OP, I have achieved this behavior using an IntervalTimer to create an interrupt that would either step the ramp, begin an ADC conversion, or read the ADC result. I was using a dispatch array of function pointers, indexed from within the IntervalTimer ISR, to perform the various tasks. Namely:
Code:
#define intrptFuncs 8                        // number of slots in the dispatch table
#define intrptIndexMask (intrptFuncs - 1)
volatile uint8_t intrptIndex = 0;
void DAC_ramp_handler(void), ADC_sample_handler(void), ADC_read_handler(void);
void nop_handler(void) {}                    // do-nothing slot, keeps the timing uniform
// array of handler functions to call on each interrupt; the IntervalTimer ISR
// dispatches control to the correct function
void (*dispatchArray[intrptFuncs])(void) = {
  &DAC_ramp_handler, &ADC_sample_handler,
  &DAC_ramp_handler, &nop_handler,
  &DAC_ramp_handler, &nop_handler,
  &DAC_ramp_handler, &ADC_read_handler
};
void rampDAC_isr(void) {
  dispatchArray[intrptIndex]();
  intrptIndex = (intrptIndex + 1) & intrptIndexMask; // count up to intrptFuncs-1, then wrap to 0
}
The &nop_handler is a do-nothing routine that is there simply to keep the timing slots uniform. Note how you get one &ADC_sample_handler for every 4 calls of &DAC_ramp_handler, achieving the desired rate. Similarly, note how there are enough interrupts between &ADC_sample_handler and &ADC_read_handler that the ADC conversion is guaranteed to have completed.
This version, though, did not use DMA transfers and was a very basic prototype to get the project running. Since it was interrupt based, it also suffered if any operation needed to suspend interrupts. Now we are looking at doing the above better using DMA transfers, to avoid the constant interrupting.
 
Perhaps a little late to the party, but...
I think the DAC interval counter might help here. See figure 35-53 in the K20 manual. My understanding is that within a single PDB loop (i.e. the PDB counter goes from 0 to PDB_MOD), you can update the DAC every DACINT0 cycles, and it steps through DAC0_DAT0{L,H} .. DAC0_DAT3{L,H}.

This would require that your DMA transfer, triggered by the PDB, copies 4 samples into the DAC buffer (it can hold up to 16). And of course there is some configuring on the DAC side to set up the hardware trigger and set the buffer size to 4.

I've not tried this, but I think it could be relevant for my application, so I'd be curious to hear any results from somebody trying it.
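For what it's worth, here is how I imagine the setup might look. This is completely untested; register names are as they appear in kinetis.h (including PDB0_DACINT0 / PDB0_DACINTC0), and all period values are purely illustrative.
Code:
#include <DMAChannel.h>

static uint16_t rampLUT[2048];                 // the full ramp, 4 samples per PDB period
static DMAChannel dacDMA;

void setupDacIntervalScheme(void) {
  SIM_SCGC2 |= SIM_SCGC2_DAC0;
  SIM_SCGC6 |= SIM_SCGC6_PDB;

  // DAC: enabled, hardware trigger (DACTRGSEL = 0), 4-word buffer in normal mode
  DAC0_C0 = DAC_C0_DACEN | DAC_C0_DACRFS;
  DAC0_C2 = 3;                                 // DACBFUP = 3: read pointer wraps after word 3
  DAC0_C1 = 0x01;                              // DACBFEN: enable the data buffer

  // eDMA: each PDB trigger moves a 4-sample minor loop into DAC0_DAT0..DAT3, then the
  // destination address is rewound by a minor-loop offset so the next trigger refills
  // the same 4 words.  Minor-loop offsets require the EMLM bit in DMA_CR.
  DMA_CR |= 0x80;                              // EMLM: enable minor-loop mapping/offsets
  dacDMA.TCD->SADDR = rampLUT;
  dacDMA.TCD->SOFF = 2;
  dacDMA.TCD->ATTR = DMA_TCD_ATTR_SSIZE(1) | DMA_TCD_ATTR_DSIZE(1);
  dacDMA.TCD->NBYTES_MLOFFYES = (1u << 30)                       // DMLOE: offset applies to DADDR
                              | (((uint32_t)-8 & 0xFFFFF) << 10) // MLOFF = -8 bytes
                              | 8;                               // NBYTES = 8 (4 samples)
  dacDMA.TCD->SLAST = -((int32_t)sizeof(rampLUT));               // rewind LUT at major-loop end
  dacDMA.TCD->DADDR = &DAC0_DAT0L;
  dacDMA.TCD->DOFF = 2;
  dacDMA.TCD->DLASTSGA = 0;
  dacDMA.TCD->CITER_ELINKNO = 512;             // 512 triggers x 4 samples = 2048
  dacDMA.TCD->BITER_ELINKNO = 512;
  dacDMA.triggerAtHardwareEvent(DMAMUX_SOURCE_PDB);
  dacDMA.enable();

  // PDB: the counter period covers 4 DAC samples; the DAC interval trigger fires every
  // DACINT0 counts and advances the DAC buffer read pointer through words 0..3.
  PDB0_MOD = 4 * 59 - 1;                       // illustrative numbers only
  PDB0_DACINT0 = 59;
  PDB0_DACINTC0 = 0x01;                        // TOE: enable the DAC interval trigger
  PDB0_IDLY = 1;
  PDB0_SC = PDB_SC_TRGSEL(15) | PDB_SC_PDBEN | PDB_SC_CONT
          | PDB_SC_PDBIE | PDB_SC_DMAEN | PDB_SC_LDOK;
  PDB0_SC |= PDB_SC_SWTRIG;
}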
 