Hello!
This is my first post here, so I hope that I've put this thread in the right place!
I suppose this is a general question, but I'll ask it in the context of my specific project.
The general question is: Is there typically a significant rise/fall time for analog readings? (By significant I mean on the order of milliseconds).
—
Here are the specifics:
I am building a musical instrument of sorts, and one of the components is a touch sensor that is used to determine (one-dimensional) finger position along a surface. The touch sensor I am using is a linear soft potentiometer (https://www.sparkfun.com/products/8681). I've wired it up in a simple voltage divider circuit so that when my finger is not touching the sensor, it is pulled high:
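For concreteness, here's roughly how I think about the divider. The component values below are placeholder assumptions (a 10 kΩ pull-up and a 10 kΩ SoftPot track, with one end of the track grounded), not necessarily my exact circuit:

```python
# Hypothetical divider model -- the exact wiring/values are assumptions,
# not a schematic of my actual circuit.
VCC = 3.3        # supply voltage (V)
R_PULLUP = 10e3  # fixed pull-up resistor from VCC to the ADC node (ohms)
R_TRACK = 10e3   # assumed SoftPot end-to-end track resistance (ohms)

def divider_voltage(fraction_along_track):
    """Voltage at the ADC node when a finger presses the wiper onto the
    strip at `fraction_along_track` (0 = grounded end, 1 = far end)."""
    r_to_ground = fraction_along_track * R_TRACK
    return VCC * r_to_ground / (R_PULLUP + r_to_ground)
```

With no finger down the wiper floats, so the pull-up takes the node to VCC and the reading sits at full scale; a touch closer to the grounded end gives a lower reading.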
I needed to determine finger position with extremely high accuracy, so I'm using an external ADC—an ADS1115 (https://www.adafruit.com/product/1085).
All of this is hooked up to a Teensy 4.0.
I am using MegunoLink (awesome plotting and debugging software, by the way!) to plot analog readings of finger position.
I am looking at the falling & rising edges that occur when a finger is placed down or lifted off of the sensor, and it appears that there is a significant amount of rise & fall time.
For example, here is the output (analog reading vs. time) graph for when a finger is removed from the sensor. I am lifting my finger straight up from the sensor, without sliding it along the strip.
Data on this graph is collected at a rate of roughly one data point every 2 milliseconds.
Here's a zoom view of the rising edge:
Looking at the data, it appears that it takes about 4 ms for the reading to go from a value of 9349 back to its pulled high state of 32676.
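One point of reference: the ADS1115's conversion time is set by its programmable data rate, and even at the fastest setting a single conversion takes over a millisecond. Quick arithmetic from the data rates listed in its datasheet:

```python
# ADS1115 programmable data rates (samples per second), from the datasheet.
# 128 SPS is the power-on default.
DATA_RATES_SPS = [8, 16, 32, 64, 128, 250, 475, 860]

# Time for one conversion at each setting, in milliseconds.
conversion_ms = {sps: 1000.0 / sps for sps in DATA_RATES_SPS}

fastest_ms = conversion_ms[860]  # ~1.16 ms per conversion at the max rate
default_ms = conversion_ms[128]  # ~7.81 ms per conversion at the default rate
```

So even a truly instantaneous voltage step could plausibly get smeared across a couple of consecutive readings at the timescale I'm logging at.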
I'm a bit confused about this behavior and I'm hoping someone here much wiser than I can shed some light.
I was not expecting to be able to detect a "rise time" at the timescales I'm measuring, since digital signals can rise in nanoseconds, which should be far faster than what we're seeing here.
My first thought was that something in the mechanics of the physical wiper might cause it to "drag" across other positions as the finger releases, but honestly that doesn't make sense. I've taken apart one of these sensors before, and it's just a conductive wiper suspended above a resistive strip. Placing a finger on the sensor presses the wiper against the strip at a specific point, and lifting it off should break the connection completely.

I have taken measurements along the sensor for calibration purposes, so I know that the value 9349 corresponds to a point about 17" down the length of the sensor, and the value 15091 corresponds to a point about 14" down. That "intermediary" value of 15091 would therefore represent a point about 3" up the strip from where my finger was actually placed before I lifted it, and I don't see how anything 3" away from my finger position could be making contact with the strip as I lift my finger up.
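For reference, here's the mapping implied by those two calibration points, assuming the sensor responds linearly between them (the function name is just for illustration, not something in my actual sketch):

```python
# My two calibration points: ADC reading <-> position (inches down the sensor).
ADC_A, POS_A = 9349, 17.0
ADC_B, POS_B = 15091, 14.0

def counts_to_inches(counts):
    """Linearly interpolate an ADC reading to a position along the strip,
    assuming a linear response between the two calibration points."""
    slope = (POS_B - POS_A) / (ADC_B - ADC_A)  # ~ -1/1914 inches per count
    return POS_A + slope * (counts - ADC_A)
```

By this mapping, every intermediate reading during the 4 ms rise corresponds to some physical point on the strip that nothing was touching.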
What do you folks think could be causing this rise time?
I suppose in my mind it could be two things, either:
1. The voltage *itself* actually rises at a very slow rate. (In which case I would have no idea why.)
2. The voltage rises almost instantaneously, but the ADC is slow to respond to rapidly changing input values.
Any thoughts would be greatly appreciated! Thanks!
—
Actually, right before posting this, I realized that while lifting my finger there could be a moment when the wiper is only *partially* touching the resistive strip below it, forming an imperfect connection (with some effective resistance) that causes the intermediary readings on the rising edge. That currently seems the most plausible explanation to me, but I would still love to hear your thoughts. Thanks!