PCT robot arm: an alternative to inverse kinematics


amatic

Member
Hi guys,
I made a robot arm based on principles of perceptual control theory, developed by W.T. Powers. As the title says, the arm can position the endpoint in space without the use of inverse kinematics. Instead, it uses a cascade, or a hierarchy of feedback loops. Positioning is not terribly precise, but I guess it can be improved.

[Image: IMAG0072.jpg]

The arm has 5 degrees of freedom. It is made from 5 mm foamed PVC board (aka Forex). Each joint has one or two geared micro motors and a potentiometer that measures the angular position of the joint. A Teensy 3.1 board reads the potentiometer values, does the calculations, and sends outputs to TB6612FNG motor drivers. The setup runs at 12 V.
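
To make that concrete, here is a minimal sketch of what one first-level joint loop looks like; the pin numbers and the gain are placeholders for illustration, not the values from my actual code:

```cpp
// Minimal sketch of one first-level loop: proportional control of a single
// joint angle, driving one TB6612FNG channel. Pin numbers and the gain are
// placeholders, not the values from ArmTeensy.ino.
const int POT_PIN = A0;   // joint potentiometer wiper
const int PWM_PIN = 3;    // TB6612FNG PWMA
const int IN1_PIN = 4;    // TB6612FNG AIN1 (direction)
const int IN2_PIN = 5;    // TB6612FNG AIN2 (direction)

float setpoint = 512.0;   // desired joint angle, in raw ADC units
const float KP  = 2.0;    // proportional gain (tune experimentally)

void setup() {
  pinMode(PWM_PIN, OUTPUT);
  pinMode(IN1_PIN, OUTPUT);
  pinMode(IN2_PIN, OUTPUT);
}

void loop() {
  float angle = analogRead(POT_PIN);            // perceived joint angle
  float error = setpoint - angle;               // reference minus perception
  float out   = constrain(KP * error, -255, 255);

  // The sign of the output selects motor direction, the magnitude sets
  // the PWM duty cycle.
  digitalWrite(IN1_PIN, out >= 0 ? HIGH : LOW);
  digitalWrite(IN2_PIN, out >= 0 ? LOW : HIGH);
  analogWrite(PWM_PIN, (int)abs(out));
  delay(5);
}
```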

Calculations on the Teensy involve two levels of feedback loops. The first level is simple proportional control of each joint angle. The inputs for the second level are calculated with basic trigonometric functions and represent the reach, elevation, and lateral displacement of the endpoint, as well as hand roll and hand pitch. The outputs of these second-order loops directly vary the setpoints of the first-order loops.
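
To show how the two levels connect, here is a rough sketch of a second-level "reach" loop for a simplified planar two-link case; the link lengths, gain, and output signs are illustrative, and the real code handles all five degrees of freedom:

```cpp
// Sketch of one second-level loop: controlling the "reach" of the endpoint
// for a planar two-link arm. The perception is computed trigonometrically
// from the joint angles, and the loop's output nudges the first-level
// setpoints. Link lengths, gain, and output signs are placeholders.
#include <math.h>

const float L1 = 100.0f, L2 = 100.0f;  // link lengths in mm (placeholders)
const float K_REACH = 0.05f;           // second-level gain (tune)

float shoulderRef = 0.5f;              // first-level setpoints, in radians
float elbowRef    = 0.5f;

void updateReachLoop(float shoulderAngle, float elbowAngle, float reachGoal) {
  // Perception: horizontal distance of the endpoint from the shoulder axis.
  float reach = L1 * cosf(shoulderAngle)
              + L2 * cosf(shoulderAngle + elbowAngle);

  // Compare the perception to the reference coming from above.
  float error = reachGoal - reach;

  // Output: adjust the lower-level references. The signs depend on your
  // joint conventions; here both joints straighten to increase reach.
  shoulderRef -= K_REACH * error;
  elbowRef    -= K_REACH * error;
}
```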

Here is the complete source code. It is still a work in progress, since I'm trying to add 'touch' perception to the hand, and there will be a simple visual system using OpenCV. I'll post that when it's done.
Attachments: ArmTeensy.ino, Kinematics.ino

The structure of the feedback loops is explained in this paper:
http://www.livingcontrolsystems.com/demos/arm_one/arm_one_win_calc.pdf
There is a full simulation with source code in Delphi here: http://www.billpct.org/

I love the Teensy 3.1 board. I'm a newbie at electronics, I've fried two Maple boards on this project, and an Arduino is a bit too slow. The 5 V tolerant inputs are really helpful.
 
Here is a diagram of all three levels of feedback loops:
[Image: T0zOMoT.png]

If anything is unclear, feel free to ask; I'd be glad to explain.
 
Screenshot of the openFrameworks program, with cursor and target positions detected by color filtering. The program takes a picture from the camera, finds the distance between the cursor and the target, compares it to a reference distance, and sends the error as a reference signal to the lower systems: the y component to reach control and the x component to lateral control (a rough sketch of this loop follows below).
[Image: 4EZWlGs.jpg]

Cursor "object" is moved by the robot arm, target by me. The system is roughly maintaining reference distance of cursor from target by moving the cursor.
 