Experimentalist
Well-known member
Can anyone advise the best way to measure the current drawn by my display using an analogue in pin?
The expected current is in the range 400-500 mA. I am using a Teensy 3.1, so the analogue inputs tolerate 3.3 V maximum.
As I see it, if I used a 0.1 ohm shunt and the display drew 400 mA, I would only get 40 mV across the shunt, giving very poor resolution relative to the 3.3 V full-scale input.
Am I right in thinking the following?
With 400 mA through the 0.1 ohm shunt I would get:
400 mA x 0.1 ohm = 0.4 A x 0.1 ohm = 0.04 V = 40 mV
(40 mV / 3.3 V) x 100% = (0.04 / 3.3) x 100% ≈ 1.21%
So I would only be using about 1.21% of the ADC's input range?
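For what it's worth, this is roughly how I am reading it at the moment. A minimal sketch only; the pin A0, the low-side shunt wiring and the 12-bit setting are my assumptions, not tested values:

// Read the low-side shunt voltage on A0 and convert it to current.
const int   SHUNT_PIN  = A0;    // analogue pin across the shunt (assumed)
const float SHUNT_OHMS = 0.1f;  // shunt resistance
const float VREF       = 3.3f;  // ADC full-scale voltage
const int   ADC_BITS   = 12;    // requested ADC resolution

void setup() {
  Serial.begin(9600);
  analogReadResolution(ADC_BITS);  // supported by Teensyduino
}

void loop() {
  int raw = analogRead(SHUNT_PIN);
  float volts = raw * VREF / ((1 << ADC_BITS) - 1);  // counts -> volts
  float amps  = volts / SHUNT_OHMS;                  // I = V / R
  // At 400 mA the shunt only gives ~40 mV, i.e. ~50 of 4095 counts.
  Serial.print(amps * 1000.0f);
  Serial.println(" mA");
  delay(500);
}

So at best I get only about 60 distinct steps over the full 0-500 mA range, which is what prompted the question.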
There must be a better way (probably many), but is there a simple, off-the-shelf solution?
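The only improvement I can think of myself is amplifying the shunt voltage before the ADC. With a gain of 50, say (a round number I picked, not a tested value):
500 mA x 0.1 ohm x 50 = 0.05 V x 50 = 2.5 V
(2.5 V / 3.3 V) x 100% ≈ 76%
so most of the input range would be used at full current. Is there a ready-made board that does something like this?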
Anyone?
Ex.