Does anyone know how long it takes, after a write to the DAC, for the output voltage to reflect the new value?
This is the relevant code -- no DMA or anything fancy. The interrupt handler works fine, and I know that it gets called correctly once every 13 microseconds.
volatile int g_bic; // global bit interrupt counter (index into data[])

// in setup()
pinMode(DAC, OUTPUT);      // the DAC pin
analogWriteResolution(12); // set DAC operation to 12-bit

void start_bit_timer() {
  g_bic = 2047;            // initialize the index *before* the timer can fire
  bit_timer.begin(bit_intr_handler, BIT_TIME);
}

// timer interrupt: writes one sample to the DAC every 13 microseconds
void bit_intr_handler() {
  analogWrite(DAC, data[g_bic]);
  if (g_bic > 0) {
    g_bic--;
  } else {
    bit_timer.end();       // all 2048 samples written; stop the timer
  }
}
Would it be better to use DMA?