SPI DMA once again: data shifted when turning on LSB-first mode

honey_the_codewitch

Solved:
Unfortunately, I'm not sure what I did. I went back and retooled my code to remove and then reinsert this functionality, and it worked. *shrug*

Sorry for the churn folks. I had been wrestling with this all morning.

Previous issue:
I have a different problem now. I hadn't changed the TCR_MASK to allow for LSB first before, which was what caused the earlier hang. The code posted below now works, aside from the issue described next:

The data seems to be getting shifted somehow in LSB-first mode. It's hard to put my finger on, but RGB565 red is coming out green, which is really peculiar. It's not the only color that's messed up, but in my tests it's the easiest to see.

Can anyone think of why this might be happening?

Previous issue before that:
Once again, I'm struggling with DMA. This time I'm simply trying to avoid byte-swapping the uint16_t color values I'm sending to the LCD.
I can do it in 8-bit mode, but I'd rather use 16-bit mode for efficiency.

The pertinent bit is the `tcr |= LPSPI_TCR_LSBF;` line below. If it's not included, everything works except the colors come out byte-swapped. With it present, the transfer hangs.

I'm missing something. How do I turn on LSB first for the duration of my color transfer? (I want MSB first for everything else)

C++:
bool lcd_spi_driver_t4::flush16_async(int x1, int y1, int x2, int y2, const void* bitmap, bool flush_cache) {
    // Don't start one if already active.
    if (_dma_state & LCD_SPI_DMA_ACTIVE) {
        Serial.println("DMA IN PROGRESS");
        return false;
    }
    _buffer = (uint16_t*)bitmap;
    int w = x2-x1+1;
    int h = y2-y1+1;
    _count_words = w*h;
    size_t bytes = _count_words * 2;
    if(flush_cache) {
        arm_dcache_flush((void*)bitmap,bytes);
    }
    if(!init_dma_settings()) {
        // Settings already existed; just point them at the new buffer.
        _dma_data[_spi_num]._dmasettings[0].sourceBuffer((const uint16_t*)_buffer, bytes);
    }
    // Start by clearing "disable request on completion" (DREQ) from both
    // settings; it will be the ISR that disables the channel...
    _dma_data[_spi_num]._dmasettings[0].TCD->CSR &= ~(DMA_TCD_CSR_DREQ);
    _dma_data[_spi_num]._dmasettings[1].TCD->CSR &= ~(DMA_TCD_CSR_DREQ);
    begin_transaction();
    write_address_window(x1,y1,x2,y2);
    // Update the TCR to 16-bit frames...
    _spi_fcr_save = _pimxrt_spi->FCR;  // remember the FCR
    _pimxrt_spi->FCR = 0;              // clear water marks...
    uint32_t tcr = LPSPI_TCR_PCS(1) | LPSPI_TCR_FRAMESZ(15) | LPSPI_TCR_RXMSK; /*| LPSPI_TCR_CONT*/
 
    tcr |= LPSPI_TCR_LSBF;
 
    maybe_update_tcr(tcr);
    _pimxrt_spi->DER = LPSPI_DER_TDDE;
    _pimxrt_spi->SR = 0x3f00;  // clear out all of the other status...
    _dma_data[_spi_num]._dmatx.triggerAtHardwareEvent(_spi_hardware->tx_dma_channel);

    _dma_data[_spi_num]._dmatx = _dma_data[_spi_num]._dmasettings[0];

    _dma_data[_spi_num]._dmatx.begin(false);
    _dma_data[_spi_num]._dmatx.enable();

    _dmaActiveDisplay[_spi_num] = this;
    _dma_state &= ~LCD_SPI_DMA_CONT;
    _dma_state |= LCD_SPI_DMA_ACTIVE;
    return true;
}
 
I've used the DMA controller several times, and it pretty much always seems to go this way: quite a bit of experimenting and fiddling to figure out how to get the results I want.
 
On to the next struggle. I made some LVGL-compliant drivers (partial screen updates using DMA) and a base class to make it easy to implement more. It works on the Teensy 4.x presently. The trouble is I can't get it to work with the ILI9341, which is odd, because aside from the initialization sequence it takes commands very similar to the ST7789's, and that controller works in my code. I'm thinking it's a timing issue, so I'm once again poring over your code to see what the secret sauce is. Hopefully I can reintegrate what I find into the base class.
 