All,

I've got a 14-bit Hall-effect rotary encoder that I'm trying to calibrate, in software, to correct for non-linearity. Specifically, it's an AS5147P, which can have an integral non-linearity of up to +/- 0.8 degrees even under optimum conditions. While the measurements are non-linear, the error is repeatable with respect to absolute angle.

I'd appreciate the community's thoughts on how to best perform calibration.

Aside from comparing against an optical encoder of equal or greater precision, the best method I've come up with is to spin the Hall sensor's magnet at a slow, constant rate and record the elapsed time at each sensor increment, from angle zero all the way around to zero again. Since angle is proportional to elapsed time at constant speed, that timing data should let me map each recorded angle to a timing-inferred true angle.
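For what it's worth, here's a rough sketch of how I'm picturing the mapping step, assuming I've logged (timestamp, raw angle) pairs over exactly one revolution at constant speed. The function names and the linear-interpolation lookup are just my own illustration, not anything from the datasheet:

```python
import numpy as np

def build_correction_table(timestamps, raw_angles):
    """Map raw encoder angles to timing-inferred true angles.

    timestamps  -- seconds, monotonically increasing, spanning exactly
                   one full revolution (zero back around to zero)
    raw_angles  -- degrees as reported by the encoder, 0..360

    Returns (raw_table, err_table): for each raw sample, the correction
    (true - raw) in degrees, to be interpolated at runtime.
    """
    t = np.asarray(timestamps, dtype=float)
    raw = np.asarray(raw_angles, dtype=float)
    # At constant speed, true angle is linear in elapsed time.
    true = 360.0 * (t - t[0]) / (t[-1] - t[0])
    # Wrap the error into (-180, 180] so samples near 0/360 behave.
    err = (true - raw + 180.0) % 360.0 - 180.0
    return raw, err

def corrected_angle(raw_reading, raw_table, err_table):
    """Apply the calibration at runtime via linear interpolation."""
    return raw_reading + np.interp(raw_reading, raw_table, err_table)
```

The idea is that the table only has to be built once (and could be burned into flash as a small lookup array), and runtime correction is just one interpolation per reading.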

Does anyone have any better ideas?