I am constructing a controller/driver to do position and current control of a hobby grade brushless DC motor. I have everything prototyped and assembled, but I am having some issues reliably measuring current using the onboard current sense resistor and amplifier.
Current is measured on the supply side of the MOSFET drive section via a 0.001 Ω shunt resistor, amplified by a MAX40201FAUA current-sense amplifier (gain of 50). No hardware filtering was implemented (I seriously regret not at least adding provisions on the PCB for it). The three motor phases are driven by PWM on pins 21, 22, and 23 of a Teensy 3.6 at 40 kHz. The output is sinusoidal.
The issue is that when calibrating for cogging current at low power (PWM differential duty cycle <= 10%), I essentially need to read the voltage drop across the resistor, via the ADC on pin 16, during a very precise window of the PWM waveform (when at least one high-side and one low-side MOSFET are turned on, on separate phases of course). This means I either need to sample at a very high rate (~4 Msps) or somehow synchronize the current-sense analogRead() with the PWM output so that it samples in the middle of the PWM period, where the difference in duty cycle between the three phases actually causes a voltage differential across the motor. Is that correct?
Can anyone offer any advice on this? Randomly timed sampling of the current-sense voltage gives very sporadic readings, since it samples both while the MOSFETs are conducting and while they are not.
Thanks!
To clarify what I mean by differential PWM: picture two PWM waveforms, each controlled by a sinusoidal function. If they perfectly overlap, the MOSFETs they control turn on and off at the same time, and no power flows. Now introduce a small increment into one of the sinusoidal functions: the waveforms stay synchronized, but they no longer overlap exactly. That small delta causes the MOSFETs they drive to be on or off at slightly different times. I hope it doesn't sound like I am lecturing; I just don't know the proper terminology for this.