Wrong voltage value

Status
Not open for further replies.

kammateo

Member
Hi,
I'm getting the wrong voltage value from an analog input. With my multimeter (and with an Arduino Mega) I read 2.565 V, but the serial monitor on my T3.5 shows 3.87 V while the multimeter still reads 2.565 V. Any ideas?
Code:
float sensorValue = 0;

void setup(void) {
  Serial.begin(9600);
}

void loop() {
  sensorValue = (analogRead(A8) * 5.0) / 1023.0;
  Serial.println(sensorValue, 5);
}
 
Kammateo,

While the T3.5's pins are 5 V tolerant, its ADC reference is 3.3 V, so a full-scale reading corresponds to 3.3 V, not 5 V.

Try changing the conversion line to:
Code:
sensorValue = (analogRead(A8) * 3.3) / 1023.0;

Note that the reading will saturate at 3.3 V for any input above 3.3 V.

You can also consider adding the following lines to your setup subroutine:
Code:
analogReadRes(12);
analogReadAveraging(10);
This will fix your resolution at 12 bits and average 10 readings per analogRead(). Note that at 12 bits a full-scale reading is 4095, so the conversion becomes analogRead(A8) * 3.3 / 4095.0.
 
Woozy, thank you very much for your reply. I changed the conversion line and it works. I'm using the example LowLatencyLoggerMPU6050 from SDfat. When you said:

You can also consider adding the following lines to your setup subroutine:
Code:
analogReadRes(12);
analogReadAveraging(10);
This will fix your resolution at 12 bits, and average 10 readings together.
Do I have to use the ADC library? Because I need to read this analog input at 200 Hz.
 
Sorry,
I don't have experience with either the LowLatencyLoggerMPU6050 example from SDfat or the ADC library.
 