Help with servo smoothing

Status
Not open for further replies.

muggins

Well-known member
Please forgive me if this has been covered elsewhere... I can't find anything relevant.
I am using a servo to drive a large camera with a fair bit of inertia. My goal is to have acceleration and deceleration between the start and end points. I have been messing around with sine wave equations and varying the length of "metro" intervals. Is this the correct approach? My brain is hurting... has anyone else cracked this?
thanks.
 
How exactly do you move the camera (please provide a drawing or a sketch) and what exactly do you need?
  • Limit the force or moment acting on the servo: this would limit the acceleration to a certain value
  • Smooth (jerk-free) motion: this would limit the acceleration's time derivative, see https://en.wikipedia.org/wiki/Jerk_(physics)
 
Hi
I'm not great with drawing, but... the camera is a specialist device weighing about 5 kg. It is well balanced and is driven directly by the servo.
I think that the path I need to pursue is to increment pulse width over variable time intervals.
 
I'm sure you can at least give us a rough representation of what your servo is doing. My interpretation of
being driven directly by the servo
is this:
[attachment: DSC_0718.jpg - sketch of the servo horn driving the camera directly]
Does that describe your mechanical setup?
 
OK, then as soon as you give the servo a position input that differs from its current position, it will start moving towards that position. How exactly it does this depends on its internal controller, which is most probably a proportional controller tuned in one way or another. It will accelerate in the right direction until it is moving at a constant angular rate (the camera has been accelerated and keeps moving due to its inertia) and then start to decelerate just before it reaches the position corresponding to your input. When it gets there it might overshoot and swing a bit, especially with a high inertia attached to it.

So whatever you implement in your acceleration algorithm, you might see some "rattling" because the camera moves to intermediate positions and decelerates before your algorithm can give the servo a new one. If your algorithm accelerates too fast, the servo will simply not see any ramp-up of your position command and move to the last position given, regardless of any preset speed you're trying to get by software. Your algorithm should operate between these extremes in order to have any smoothing effect.

With an algorithm that is executed every 20 ms (the usual update rate of servo signals), the simplest thing you can do is keep position and speed as state variables. Then, in every iteration, adjust the speed to get a bit closer to the desired speed. The difference between the old and new speed is the acceleration. Add the speed to the current position and output that to the servo. You have two parameters that can be adjusted to work well with your servo/camera combo: acceleration and speed. If you get both right, this might result in very smooth movement.
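In plain C++ terms, that bookkeeping might look roughly like this (a sketch only; MoveState, tick and the parameter names are my own, not from the thread):

```cpp
#include <algorithm>

// One fixed-period tick of the position/speed bookkeeping described
// above (hypothetical names). 'targetSpeed' and 'accel' are the two
// tuning knobs mentioned in the post.
struct MoveState {
    float pos;
    float speed;
};

void tick(MoveState &s, float targetSpeed, float accel) {
    // Nudge the speed toward the desired speed by at most 'accel' per
    // tick; the size of this nudge is the acceleration.
    if (s.speed < targetSpeed)
        s.speed = std::min(targetSpeed, s.speed + accel);
    else
        s.speed = std::max(targetSpeed, s.speed - accel);
    // Integrate: this is the value you would output to the servo.
    s.pos += s.speed;
}
```

Keeping accel small relative to targetSpeed gives a gentler ramp at the cost of a longer move.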

You can also simplify it a bit and work with a fixed speed. In that case, if your position variable is not what you want it to be, simply add some value to it until it has reached the desired value:
Code:
(every 20 ms) {
  if (x < target) {
    x = min(target, x + velocity)
  } else if (x > target) {
    x = max(target, x - velocity)
  }
  servo_out(x)
}

Rephrasing it: your servo is a rabbit and you have the carrot. You need to figure out how to move the carrot forward so it stays within a given distance of the rabbit. The fact that the servo has a hard speed limit makes this a rather non-linear system that is hard to control. Unfortunately.
 
Hi Christoph. Thank you for the response. Given that I have no direct control over velocity, I have devised this (very rough, unidirectional, and using delays).
It uses sine wave shaping. That is, the pulse width increments by one each loop, and the delay between steps is long at either extreme and short in the middle.
I somehow need to get rid of the delays, as there is other processing. I am a bit of a novice, but do you think this is feasible?
thanks
John


Code:
if (targetPos != CurrPos) {
  DEBUG_PRINTLN("STARTING MOVE#############################");
  DEBUG_PRINT("DIFFERENCE IN POSITIONS: "); DEBUG_PRINTLN(targetPos - CurrPos);
  for (int i = CurrPos; i < targetPos; i++)
  {
    float i2 = map(i, CurrPos, targetPos, 0, 180);
    float JMdelay = 1 - sin(i2 * PI / 180);
    DEBUG_PRINT("position: ");
    DEBUG_PRINT(i);
    DEBUG_PRINT(" delay: ");
    DEBUG_PRINTLN(JMdelay * 10000);
    JMservo.writeMicroseconds(i + 900);
    delayMicroseconds(JMdelay * 10000);
  }
  CurrPos = targetPos;
  DEBUG_PRINTLN("FINISHED #############################");
}

}
 
This is how I usually write delay-less algorithms:
Code:
... other processing here...

static const uint8_t servoAlgoPeriod = 20; // 20 ms period, or 50 Hz
static elapsedMillis servoAlgoTimer = servoAlgoPeriod; // initialize with the period: execute algorithm immediately after startup

if (servoAlgoTimer >= servoAlgoPeriod)
{
  servoAlgoTimer = 0; // you can also do servoAlgoTimer -= servoAlgoPeriod, with slightly different implications
  executeServoAlgo();
}

... other processing here ...

Your code seems to adjust the delay between servo position updates, which is different from what I'm outlining above. You can probably change your calculation to use a fixed period and variable position updates instead - this should be easy. You can also keep the variable delay and adapt my code example. Either way, use elapsedMillis instead of delay().
 
I'm familiar with elapsedMillis but just haven't worked out the logic of introducing it into my "for" loop.
Using fixed intervals, for me anyway, makes it difficult to calculate how many time intervals there should be during the move and how many servo microsecond steps there should be in each interval to follow the sine curve. Anyway, I'll have a go.
 
I could try to help you further if you provide information about that map() function, or at least a general description of the type of movement you want to achieve. The expression 1 - sin(...) can result in any value between 0 and 1, and that is what goes into your delay call. This tells me there is room for improvement, because updating the servo value with a delay of less than 20 milliseconds will probably not have any effect at all.
 
I realize it all needs to be tidied up, but roughly: the map() splits the difference between the target and current positions into 180 sections, which are then made to follow a sine curve. In theory this gives a slow start and finish to the traverse. I'm not sure if this is a good method, and wonder whether it would be better to ramp up to speed and ramp down linearly at the end. Any help much appreciated.
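For what it's worth, the same sine shaping can be expressed the other way around, with a fixed update period and a variable position step: the position becomes a sine-eased function of progress through the move. This is only a sketch; easedPosition and its parameters are my own names, not from the thread:

```cpp
#include <cmath>

// Sine-eased position for a fixed update period (hypothetical helper).
// Instead of a fixed 1-step position change with a sine-shaped delay,
// the delay is fixed and the position follows an eased curve: slow at
// both ends, fastest in the middle. step runs from 0 to steps.
int easedPosition(int start, int target, int step, int steps) {
    const float kPi = 3.14159265f;
    float t = (float)step / (float)steps;        // progress, 0..1
    float s = 0.5f - 0.5f * std::cos(t * kPi);   // eased progress, 0..1
    return start + (int)std::lround((target - start) * s);
}
```

Called once per fixed 20 ms tick, this moves slowly near both ends and fastest in the middle, just like the variable-delay version.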
 
It's still not clear to me what you want the motion to look like. Can you draw a plot showing the desired velocity over time? You can probably simplify things if you don't go for a sine, but constant acceleration. Here's a plot of v(t) = sin(t) and a simplified one with constant acceleration:
[attachment: save.png - plots of v(t) = sin(t) and a constant-acceleration velocity profile]
 
Hi
I don't really know which is the better option, but in my opinion the sine one is easier to implement as it requires only one formula.
If I go for the linear then I will need something like:

if diff <= ramp duration:
  ramp up to (diff / max) * max, then down again

if diff > 2 * ramp duration:
  ramp up to max, stay linear for a while, then ramp down to 0

if diff < 2 * ramp duration:
  ramp up to diff * max / (2 * ramp duration), then ramp down again

As I said I don't know the best approach to this and am only guessing.
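For reference, the three cases above can be condensed into a single peak-speed formula under a constant-acceleration assumption (a sketch; the function and parameter names are mine, not from the thread):

```cpp
#include <algorithm>
#include <cmath>

// With constant acceleration 'accel', the speed reachable by
// accelerating over half the distance 'dist' is sqrt(accel * dist)
// (from v^2 = 2 * a * (d / 2)). The move's peak speed is that value or
// the configured maximum, whichever is smaller: if the cap is hit, the
// profile is a trapezoid (ramp up, cruise, ramp down); otherwise it
// degenerates to a triangle (ramp straight up, then straight down).
float peakSpeed(float dist, float maxSpeed, float accel) {
    float trianglePeak = std::sqrt(accel * dist);
    return std::min(maxSpeed, trianglePeak);
}
```

This replaces the case analysis: the "is the move long enough to reach max speed?" question is answered by which argument of min() wins.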
I will respond again tomorrow as I am cooking chicken for my mother in law (mother's day)

thanks
 
I need help please as the logic of this evades me...
This is the flow in plain English. I don't know how to use elapsedMillis instead of a blocking delay in this scenario.
This all runs within loop(), and I somehow need to protect the for-loop from restarting on every pass. Elsewhere in the program, targetPos gets updated.

Within the loop......

If targetPos > currentPos
then
diff= targetPos-currentPos

for i = currentPos to targetPos

  calculate a delay (using i and diff)
  currentPos += 1
  move servo to currentPos
  delay by the calculated delay ....................... need to use elapsedMillis here

next i
End if

Any help much appreciated
John
 
Think in terms of "when it's time to update the position, do that".

Your for-loop iterates from the current position to the final position all in one go, and diff is computed once before the loop starts. When turning this into non-blocking code, you need to preserve some of this behavior. In particular, diff must not be recalculated while the servo is moving, as it would get smaller after every position update and mess up your timing.

1) you need a function that updates the target and the "diff" and leaves both values untouched unless needed.
2) make your loop non-blocking:

(sketchy)
Code:
static uint8_t nextDelay = 0; // static makes sure that these are not re-initialized to zero in every main loop.
static elapsedMillis servoTimer;
if((targetPos > currentPos) && (servoTimer >= nextDelay))
{

  nextDelay = calculate a delay (using currentPos and diff)
  servoTimer = 0
  currentPos+=1
  move servo to currentPos
  if (currentPos == targetPos)
  {
    nextDelay = 0; // we need this to properly restart the "loop" upon the next target update
  }
}

Draw what the loop does on a piece of paper, iteration after iteration. Eventually you'll get it.
 
Christoph
Thank you so much for taking the time to look at this. I'm normally not bad with logic and flows but this one has given me a headache.
I will sketch this one on paper and try to follow the flow. (tomorrow I think)
John
 
Christoph
I don't think this will work, because it recalculates a new delay on each iteration and loses track of the original difference in positions.
The calculation works out the ramp-up and ramp-down delays. For example, if there is a call for a target of 20 and the current position is 0, the delay pattern would be:
10, 9, 8, 7, 6, 5, 4, 3, 2, 2, 2, 2, 3, 4, 5, 6, 7, 8, 9, 10
This piece of code loses that functionality, as it can only see the difference in positions at any given iteration.
I'm sure there is a way to do this but I can't see it.....
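One way out of this (a sketch only; delayTable and its parameters are my own invention) is to precompute the whole delay pattern once, at the moment a new target is accepted, so the non-blocking loop never has to re-derive it from the shrinking position difference:

```cpp
#include <algorithm>
#include <vector>

// Build the symmetric delay table for a move of 'steps' increments
// (hypothetical helper). Each entry is the largest of: the cruise delay
// 'minDelay', the ramp counted from the start (long delays first, i.e.
// a slow ramp-up), and the ramp counted from the end (long delays last,
// i.e. a slow ramp-down). The non-blocking loop then only looks up the
// next entry after each position update.
std::vector<int> delayTable(int steps, int maxDelay, int minDelay) {
    std::vector<int> d(steps);
    for (int i = 0; i < steps; ++i) {
        int fromStart = maxDelay - i;               // slow near the start
        int fromEnd = maxDelay - (steps - 1 - i);   // slow near the end
        d[i] = std::max({minDelay, fromStart, fromEnd});
    }
    return d;
}
```

delayTable(20, 10, 2) reproduces exactly the 10, 9, ..., 2, ..., 9, 10 pattern quoted above.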
 
That's what I mean by

"1) you need a function that updates the target and the "diff" and leaves both values untouched unless needed."

Your code should not accept new target input while the servo is moving to the target position.
 
Christoph
Sorry, you did in fact make that clear. My mistake - I jumped straight into your code without reading your post again. So I need another function within the loop, containing a boolean "isMoving".
 
No problem, you're interested in solutions - not the accompanying bla bla (I have that problem from time to time as well).

It would be even better to put all this into a class that hides the details and only allows setting the position. You also need to make up your mind about what happens when the application tries to update the target while the servo is moving.
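A minimal sketch of what such a class could look like (entirely my own naming; the servo write is left as a comment since it depends on the servo library in use):

```cpp
// Hides the move state behind a target setter and a per-tick update().
// As one possible design choice, it simply drops targets that arrive
// while a move is in progress.
class SmoothServo {
public:
    void setTarget(int t) {
        if (moving) return;          // reject input mid-move (design choice)
        target = t;
        moving = (t != pos);
    }
    // Call this from loop(), no more often than once per update period.
    void update() {
        if (!moving) return;
        pos += (target > pos) ? 1 : -1;
        // JMservo.writeMicroseconds(pos + 900); // actual servo output here
        if (pos == target) moving = false;
    }
    int position() const { return pos; }
    bool isMoving() const { return moving; }
private:
    int pos = 0;
    int target = 0;
    bool moving = false;
};
```

Ignoring mid-move targets is only one of several reasonable answers to the update-while-moving question; queueing the new target or retargeting on the fly are alternatives.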
 
Christoph
Putting this in a class would indeed be an elegant way to present it. However, despite having spent a couple of decades doing VB programming, and having now attacked this code with loads of Serial.print()s, I just cannot find a way to make it workable.
 
Well, you seem to be a bit frustrated, but please provide a problem description that goes beyond "I just cannot find a way to make it workable". As always: post your code, describe what you expect it to do and what it does instead. We have no way of helping you without that!
 
We have the "Forum Rule" here because it really does save everyone a lot of time and frustration when complete code samples (which can be copied and pasted into Arduino and run on real hardware) are posted.

Here, let me demonstrate. Here's a complete program which you can copy and paste into Arduino and run on your Teensy:

Code:
int servoTargetPosition=0;
int servoActualPosition=0;
elapsedMillis msecSinceLastUpdate;

void setup() {
  while (!Serial) ; // wait for serial monitor
  Serial.println("slow servo move test...");
}

void setTargetPosition(int pos)
{
  servoTargetPosition = pos;
  Serial.print("Set Target Position To: ");
  Serial.println(servoTargetPosition);
}

// call this rapidly...
void moveSlowly()
{
  // move no faster than 1 degree every half second
  if (msecSinceLastUpdate < 500) return;
  msecSinceLastUpdate = 0;

  // nothing to do if already at correct position
  if (servoActualPosition == servoTargetPosition) return;

  // increment or decrement toward the target
  if (servoActualPosition < servoTargetPosition) {
    servoActualPosition = servoActualPosition + 1;
  } else {
    servoActualPosition = servoActualPosition - 1;
  }
  Serial.print("Servo move to: ");
  Serial.println(servoActualPosition);
  // TODO: actually control a servo motor here....
}

void delayWhileMoving(int milliseconds)
{
  elapsedMillis ms=0;
  while (ms < milliseconds) {
    moveSlowly();
  }
}

void loop() {
  setTargetPosition(90);
  delayWhileMoving(50000);
  setTargetPosition(20);
  delayWhileMoving(10000);
  setTargetPosition(120);
  delayWhileMoving(100000);
}

Hopefully this helps?

Please, if you need help with code, ask questions with complete programs posted. You'll get much better help with a lot less frustration!
 