TLV320AIC3105 Codec in TDM mode

I have it working just fine in I2S mode.

bool AudioControlTLV320AIC3105::aic3105_initCLK (select_wire wires, device dev){

//Table 12. Page 0/Register 7: Codec Datapath Setup Register
writeRegister(wires, dev, Page_00, 0x07, 0b10001010);// w 30 07 8A P0_R7 0b10001010
//Table 13. Page 0/Register 8: Audio Serial Data Interface Control Register A
// D5 R/W 0 Serial Output Data Driver (DOUT) 3-State Control
// 0: Do not place DOUT in high-impedance state when valid data is not being sent.
// 1: Place DOUT in high-impedance state when valid data is not being sent.
writeRegister(wires, dev, Page_00, 0x08, 0b00100000);// Register 8 (0x08), D5 = 1: DOUT 3-stated when not transmitting
// Table 14. Page 0/Register 9: Audio Serial Data Interface Control Register B
writeRegister(wires, dev, Page_00, 0x09, 0b00000000);
//Table 100. Page 0/Register 102: Clock Generation Control Register
//writeRegister(wires, dev, Page_00, 0x66, 0b00000010);// w 30 66 A0 P0_R102 0b10100000
return true;
}
In TDM mode the offset is "1", so the DAC responds perfectly for each slot in 16-bit mode: device 1 shift 1 | device 2 shift 33 | device 3 shift 65 | device 4 shift 97.

But the ADC doesn't seem to be following the same offset. The following is my enableTDM code. Below I have it set as device 2 (bit 33), as I'm experimenting with moving it around the TDM slots to see if anything changes...
bool AudioControlTLV320AIC3105::aic3105_enableTDM(select_wire wires, device dev) {

//Table 14. Page 0/Register 9: Audio Serial Data Interface Control Register B
//Set to DSP mode
//Specify 16 bit word length.
writeRegister(wires, dev, Page_00, 9, 0b01000111);

//Table 15. Page 0/Register 10: Audio Serial Data Interface Control Register C
//Set the Offset 1 bit clock - 255 bit clocks.
// 16 Bit Mode: device 1 shift 1 | device 2 shift 33 | device 3 shift 65 | device 4 shift 97
writeRegister(wires, dev, Page_00, 10, 33);

return true;
}
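For anyone following along, the slot offsets above boil down to a simple pattern: 1 bit of DSP-mode delay plus 32 bit clocks per slot. A tiny helper like this (hypothetical naming, not part of my actual library code) keeps the P0/R10 value and the slot assignment in one place:

// Hypothetical helper: P0/R10 data offset for 16-bit data on 32-bit TDM slot spacing.
// slot 0 -> 1 (device 1), slot 1 -> 33 (device 2), slot 2 -> 65, slot 3 -> 97.
static uint8_t aic3105_tdmOffset(uint8_t slot) {
  return 1 + 32 * slot;   // 1-bit DSP-mode delay + 32 BCLKs per slot
}
// e.g. writeRegister(wires, dev, Page_00, 10, aic3105_tdmOffset(1)); // device 2 -> 33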

I'm pretty sure that the Audio Library expects 16-bit samples, but spaced on 32-bit slots. This is pretty easy to adjust in the design tool: just skip a slot or not (see the sketch below). But what about that 1-bit offset? Is the offset the same on both ADC and DAC? This codec only has one register to adjust the offset, so both ADC and DAC are moved over by X bits.
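For reference, here is roughly what the "skip a slot" wiring looks like with the stock Teensy Audio Library TDM objects, where each of the 16 channels is 16 bits and two adjacent channels make up one 32-bit slot. This is only a minimal sketch of the idea; the sine source is just a test signal and the codec setup is elided:

#include <Audio.h>

// Each TDM channel is 16 bits; channels 0+1 form the first 32-bit slot, 2+3 the
// second, and so on. Connecting only the even channels puts one 16-bit sample at
// the start of each 32-bit slot.
AudioSynthWaveformSine sine1;   // test source
AudioOutputTDM         tdm_out;
AudioInputTDM          tdm_in;

AudioConnection patchCord1(sine1, 0, tdm_out, 0);  // first 32-bit slot
AudioConnection patchCord2(sine1, 0, tdm_out, 2);  // second 32-bit slot

void setup() {
  AudioMemory(32);
  sine1.frequency(1000);
  sine1.amplitude(0.2);
  // aic3105_initCLK(...) / aic3105_enableTDM(...) would be called here
}

void loop() {}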

Here is a recap of what I'm doing.
  • 16 bit mode
  • "DSP" mode which is TDM
  • 1 bit offset.
Both the DAC and ADC work fine in I2S mode (proving my connections and configurations are correct). The DAC works fine in TDM mode, but the ADC does not: I get a loud scratchy noise from it, which suggests something is off with the timing.

My custom boards have some 74LVC1G125GV buffers, but I've tested with them enabled and disabled/bypassed and the same situation occurs.

Any idea?
 
Just as some notes on my ongoing research.

I think the next thing I'm going to try is to use h4yn0nnym0u5e's audio library fork to see if it changes anything.

I was thinking about also trying chipaudette's Float32 library but that fork doesn't have TDM mode.

I just have to remember how to bypass the libraries in VS Code / PlatformIO...
 
Haven’t touched much of the TDM code, except to adopt a fix I found languishing un-merged in the PR pile (Issue#429). I may take a look at the datasheet and code, see if I can spot anything. I only have a CS42448 to test with, though.
 
I know you found a backwards bit in the TDM code which put the bits out of order; I read about that somewhere a while back.

The only thing that could be hardware related here is that I have a buffer on the I2S lines. My custom main board is buffering MCLK, BCLK, LRCLK, SDOUT, and SDIN (reversed). Then the sub-module is sending the SDOUT back to the main board via a buffer as well. This afternoon I bypassed the main-board buffer so that the SDOUT was only buffered on the sending end, but I haven't tried bypassing the buffer altogether. Technically the SDOUT is therefore going to be some nanoseconds behind the clock, because it receives the clock and sends the signal out through the buffer.

It's a very small number, which the 74LVC1G125GV datasheet gives as 2.1 ns @ 3.3 V. It works fine with I2S; could it be this with the presumably higher-speed TDM?

EDIT: 0.5 ns min, 2.1 to 4.3 ns max.
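As a rough sanity check (assuming the usual Teensy TDM clocking of 256 bit clocks per frame at roughly 44.1 kHz, which is my assumption here, not something I've measured): BCLK ≈ 256 × 44.1 kHz ≈ 11.29 MHz, so one bit period ≈ 88.6 ns. Even the 4.3 ns worst case is only about 5% of a bit period, so on paper the buffer delay shouldn't be enough to shift the data by a whole bit.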
 
I just bypassed the buffer altogether - it is the same. Still scratchy like the ADC is sending data at the wrong start bit. I asked TI and they said that the chip adjusts both DAC and ADC together with that same register (Table 15. Page 0/Register 10). I guess it's time to get out the oscilloscope to take a look at it. I suspect that whatever format the 3105 is sending over TDM isn't what the audio library expects somehow.
 
First shot from the scope.

Yellow = LRCLK
Purple = DAC Data (out of mainboard, into 3105)
Green = ADC Output directly from 3105
Blue = ADC Output after buffer

PXL_20231228_153151679.jpg


Based on this, it does appear that the buffer adds nearly zero delay, so nothing to be concerned about there.

In this next image I wanted to see the BCLK so I changed green to that.

Yellow = LRCLK
Purple = DAC Data (out of mainboard, into 3105)
Green = BCLK
Blue = ADC Output after buffer

20231228_110813.jpg


Looking at the purple DAC and the blue ADC, it appears that the ADC data is coming in yet another bit later than the DAC. I'm not sure whether that is to be expected or not, but if it is the case then I would probably need to adjust the Audio Library to receive the ADC at an offset of 2... I don't think my buffers are causing any of that delay, because I tested both the pre- and post-buffer outputs. The ADC seems to be at offset 2 and the DAC at offset 1. I'll also ask TI.
 

Attachments

  • 20231228_104910.jpg
Fixed the BCLK trace (the probe's ground was disconnected). And played a 1 kHz sine wave from USB > Mainboard > 3105 and from Phone > 3105 > Mainboard.

Purple is DAC and blue is ADC... It seems off to me. The ADC is too far to the right. TI said this shouldn't be the case. Still trying to find a solution there too.

20231228_125020.jpg
 
I2S mode works great with both DAC and ADC. It sounds good. But doesn't this still look funny, with the data lines seemingly behind the clock? It should be on the rising edge, right?

Screenshot 2023-12-28 144231.png
 
Those scope traces are way too slow to transition, perhaps you have your 'scope probes set to x1 rather than x10 (which is required for fast signals).
 
Hello MarkT,

Thanks for the advice. I didn't know. But you are right, it does look better when 10x is enabled on the probe and 10x is also selected on the scope. I presume that's how it's done. My yellow probe is broken, so the 10x switch doesn't work. So I am showing just one signal line at a time.

TDM example, ADC LINE.
PXL_20231229_103229464.jpg


TDM EXAMPLE, DAC LINE.
PXL_20231229_103337703.jpg


I2S EXAMPLE ADC LINE
PXL_20231229_104012333.jpg

I2S EXAMPLE DAC LINE
PXL_20231229_104030633.jpg


There is clearly an issue with the TDM mode. The ADC data is being sent late.
 
There must be some mismatch in my ADC configuration. The 3105 must be putting the ADC on the wrong bitrate or something. Hmm.

TI did mention that they find it unlikely that the clocks are at fault. If it works for I2S mode then it should work for TDM mode.
 
It doesn't look out of specification. For the TLV320AIC3105, td(DO-BCLK) is specified as 20ns maximum at IOVDD=3.3V (section 8.6 on p11), and it looks less than that. For the i.MX RT1060 SAI interface, "S9: SAI_RXD/SAI_FS input setup before SAI_BCLK" is 15ns minimum (table 50 on p60 of IMXRT1060CEC.pdf), and it looks like you've got about 30ns. Assuming here that the green trace is BCLK, and the blue is what you're calling "ADC LINE" aka DO from the TLV320 to SAI_RXD on the i.MX RT.

You do know you can do screen captures to a USB stick on that 'scope, don't you? ;)
 
You do know you can do screen captures to a USB stick on that 'scope, don't you? ;)
Yes, but my phone auto-uploads to the Internet for easy retrieval, as opposed to swapping devices. But eh, maybe I'll try that next time. Thanks!

If it's in spec, what else can I look at? Thanks for taking the time! FYI, I have a TLV320ADC6140 on the exact same I2S bus that works fine. I've removed it for these tests.

Yes, I'm calling it ADC LINE because when you are referring to multiple devices I find it can get confusing to say "in" or "out". If I do say that, I'm usually thinking in terms of the MCU (in or out of it).

Jay
 
Good plan. Understood about your nomenclature, but I hate making the assumption I'm right in my interpretation of what others say ... it can go so horribly wrong!

Just digging into the data sheet more ... can't find the place where it says you can do 256-clock mode as a slave. Also, your 'scope trace looks like the word clock pulse is half a clock different from the datasheet - wondering if that's relevant.
1703871976029.png
 
Hello MarkT,

Thanks for the advice. I didn't know. But you are right, it does look better when 10x is enabled on the probe and 10x is also selected on the scope. I presume that's how it's done. My yellow probe is broken, so the 10x switch doesn't work. So I am showing just one signal line at a time.
x10 mode uses a capacitive divider in parallel with a resistive divider to maintain good bandwidth; x1 mode acts as an RC low-pass filter, typically limiting the bandwidth by around an order of magnitude. 'Scope probes are very clever.
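For the record, the standard compensated-divider relation behind this (typical values, not specific to any particular probe): the probe tip has R1 ≈ 9 MΩ with a small trimmer capacitor C1 across it, and the 'scope input is R2 ≈ 1 MΩ in parallel with C2 (input plus cable capacitance). The division stays flat across frequency when the time constants match:

R1 · C1 = R2 · C2   →   Vscope / Vtip = R2 / (R1 + R2) = 1/10 at all frequencies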
 
Just digging into the data sheet more ... can't find the place where it says you can do 256-clock mode as a slave. Also, your 'scope trace looks like the word clock pulse is half a clock different from the datasheet - wondering if that's relevant.

I'm thinking that the word clock should rise when the bitclock rises, and that's the reason for the issues. Is there a setting somewhere to change the clock timing on the i.MX RT1060 SAI?

Listening more to the DAC, I would say its sound quality is also negatively impacted compared to I2S.
 
According to the TLV320AIC3105 datasheet, WCLK must *NOT* rise together with BCLK.

You can see in figure 3 that WCLK, SDOUT, SDIN are expected to be stable during the BCLK rising edge.

1703963553566.png


Like all synchronously clocked digital signals, these have a setup time where the signal must be stable before the clock edge (rising in this case) and a hold time where it must remain stable after the clock edge.

With the BCLK period being 354ns, and the setup+hold time being only 16ns, the WCLK (LRCLK) signal is allowed to change at any time during the other 338ns. It only is required to be the correct voltage 10ns before the BCLK rising edge and remain so for 6ns after.

Hopefully it's obvious that having LRCLK change at or very soon after the BCLK falling edge meets this requirement. That also happens to be the way it's depicted in Figure 3 (though as a matter of documentation style, I personally feel showing setup and hold relative to the same clock edge makes a much clearer impression than the way someone at TI chose to draw this diagram). Having WCLK (LRCLK) change together with BCLK would absolutely violate the required setup and hold time. It only has to be stable for 16ns, right at that moment of the BCLK rising edge!

1703964070365.png
 
I'm thinking that the word clock should rise when the bitclock rises, and that's the reason for the issues. Is there a setting somewhere to change the clock timing on the i.MX RT1060 SAI?
You can change the polarity, and it looks as if it might be worth a try. In output_tdm.cpp there are lines that set the Bit Clock Polarity; see the i.MX RT1060 Reference Manual sections 38.5.1.6.4 and 38.5.1.15.4.
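Roughly speaking (paraphrased from memory, so the exact lines in output_tdm.cpp may differ), the Teensy 4 TDM setup programs the SAI TCR2/RCR2 registers, and BCP is the Bit Clock Polarity bit that controls which BCLK edge drives and samples data:

// Paraphrased from memory of output_tdm.cpp (Teensy 4.x path); may not match the source exactly.
// With I2S_TCR2_BCP set, data is driven on the falling BCLK edge and sampled on the rising edge;
// clearing it flips the polarity.
I2S1_TCR2 = I2S_TCR2_SYNC(0) | I2S_TCR2_BCP | I2S_TCR2_MSEL(1)
          | I2S_TCR2_BCD | I2S_TCR2_DIV(0);
// ...and there is a matching I2S1_RCR2 assignment with I2S_RCR2_BCP for the receive side.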
 
The diagram in msg #17 should probably be considered only a concept illustration. It shows the word clock rising at the same time as the bit clock rising, which is clearly wrong timing.

Look at figure 14 in the TLV320AIC3105 datasheet if you need a better diagram showing the entire frame rather than per-clock timing details.

1703964539895.png
 
Did you get it working? I'm trying to figure out how to get a digital microphone with TDM output working with a TLV320AIC3120 and Teensy.
 