Controlling Robot Hands

Magnus Johnsson and Christian Balkenius

Both the modelling of haptic perception and its implementation in robots are neglected areas of research. Robot hand research has mainly focused on grasping and object manipulation, and many models of hand control have concentrated on the motor aspect rather than on haptic perception. To investigate haptic perception in robotic systems, we have built three robot hands.

The software for the LUCS haptic hands was developed in C++ and runs within the Ikaros system, which provides an infrastructure for computer simulations of the brain and for robot control.

LUCS Haptic Hand I

The first system (Johnsson & Balkenius, 2006) performed haptic size perception. It used a simple three-fingered robot hand, the LUCS Haptic Hand I, with the thumb as the only movable part, equipped with 9 piezoelectric tactile sensors. The system used self-organizing maps (SOMs) and a neural network with leaky integrators, and it successfully learned to categorize a test set of spheres and cubes according to size.
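The leaky-integrator units mentioned above accumulate tactile input over time, so a brief contact leaves a decaying trace. The following is a minimal sketch of that idea; the function name and the leak constant are our own illustrative assumptions, not the published model's parameters:

```python
def leaky_integrator(inputs, leak=0.9):
    """Return the unit's activity after integrating a sequence of inputs.

    The activity decays by `leak` each time step and is driven toward
    the current input, so the output integrates the tactile signal
    over the course of a grasp. (Illustrative sketch; the leak value
    is an assumed parameter.)
    """
    activity = 0.0
    for x in inputs:
        activity = leak * activity + (1.0 - leak) * x
    return activity

# A brief press on a sensor leaves a trace that outlasts the contact:
trace = leaky_integrator([1.0] * 5 + [0.0] * 5)
```

Because the trace persists after the sensor is released, downstream categorization can work on a summary of the whole grasp rather than on instantaneous readings.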

Hand 1
LUCS Haptic Hand I. This robot has a single degree of freedom and is able to categorize objects according to size.

LUCS Haptic Hand II

The second system (Johnsson & Balkenius, 2007a) performed haptic shape perception and used a three-fingered 8-dof robot hand, the LUCS Haptic Hand II, equipped with a wrist for horizontal rotation and a mechanism for vertical re-positioning. This robot hand carried 45 piezoelectric tactile sensors. The system gathered tactile information through active exploration, grasping each object several times with the robot hand. The LUCS Haptic Hand II had no proprioceptive sensors; instead it used the positioning commands sent to the actuators, which is less accurate than real proprioception since the desired positions are not necessarily the actual positions. Depending on the version of the system, either tensor product operations or a novel neural network, the Tensor Multiple Peak SOM (T-MPSOM), was used to code the tactile information in a useful way, and a SOM finally performed the categorization. The system successfully learned to discriminate between different shapes as well as between different objects within a shape category, such as a set of spheres, blocks and cylinders.
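The tensor-product coding can be illustrated as an outer product that binds each tactile reading to the hand configuration in which it occurred. This is a toy sketch with made-up numbers and vector sizes, not the actual sensor layout of the LUCS Haptic Hand II:

```python
def tensor_product(tactile, posture):
    """Outer product of a tactile vector and a posture vector.

    The result has one entry per (sensor, posture-element) pair, so
    the code captures what was felt together with where the hand was
    when it felt it. (Toy example; real vectors are much larger.)
    """
    return [[t * p for p in posture] for t in tactile]

# Three tactile readings bound to a two-element posture code:
code = tensor_product([0.2, 0.0, 1.0], [0.5, 1.0])
```

The same contact pattern produced in two different grasps yields two different matrices, which is what lets a downstream SOM separate objects that feel alike locally but are grasped differently.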

Hand 2
LUCS Haptic Hand II uses 8 degrees of freedom and 45 sensors to categorize objects according to shape.

LUCS Haptic Hand III

The current system, LUCS Haptic Hand III, is a five-fingered 12-dof anthropomorphic robot hand equipped with 11 proprioceptive sensors (Johnsson & Balkenius, 2007b). The robot hand has a thumb consisting of two phalanges, whereas the other fingers have three phalanges. The thumb can be separately flexed/extended in both the proximal and the distal joints and adducted/abducted. The other fingers can be separately flexed/extended in their proximal joints, whereas the middle and the distal joints are flexed/extended together. The wrist can also be flexed/extended. All this is similar to the human hand.

The phalanges are made of plastic pipe segments, and the force transmission from the actuators, which are located in the forearm, is handled by tendons inside the phalanges, much like the tendons of a human hand. All fingers except the thumb are mounted directly on the palm. The thumb is mounted on an RC servo, which controls the adduction/abduction; this servo sits on the proximal part of the palm, similar to the site of the thumb muscles in a human hand. The actuators of the fingers and the wrist are located in the forearm, again like the muscles that actuate the fingers of a human hand. The hand is driven by 12 RC servos in total, and to obtain proprioceptive sensors the internal potentiometers of the RC servos, except the one that actuates the wrist, have been included in the sensory circuit.
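Reading a servo's internal potentiometer amounts to mapping a raw voltage reading to a joint angle. The sketch below illustrates the idea; the ADC range and angle span are assumed values for illustration, not the actual calibration of the LUCS Haptic Hand III:

```python
# Assumed values for illustration only:
ADC_MIN, ADC_MAX = 0, 1023          # assumed 10-bit ADC range
ANGLE_MIN, ANGLE_MAX = 0.0, 180.0   # assumed joint travel in degrees

def adc_to_angle(adc_value):
    """Linearly map a raw potentiometer reading to a joint angle."""
    span = (adc_value - ADC_MIN) / (ADC_MAX - ADC_MIN)
    return ANGLE_MIN + span * (ANGLE_MAX - ANGLE_MIN)

# Unlike the positioning commands used by the LUCS Haptic Hand II,
# this reflects where the joint actually is, e.g. when a grasped
# object stops a finger short of its commanded position:
actual = adc_to_angle(512)  # roughly mid-travel
```

The important property is that the reading comes from the joint itself, so the difference between the commanded and the measured angle carries information about the grasped object.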

Hand 3
LUCS Haptic Hand III

The system uses a self-organizing map for the mapping of the explored objects. In our experiments the system was trained and tested with 10 objects of different sizes from two shape categories. To estimate the generalization ability, the system was also tested with 6 new objects. The system showed good performance both with the training objects and with the new objects in the generalization experiment. In both cases it was able to discriminate shape and size, and to some extent the individual objects.
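The self-organizing-map idea behind this mapping can be sketched with a toy one-dimensional map: each input finds its best-matching unit, which (together with its neighbours, with decaying strength) is pulled toward the input. Map size, learning rate and data below are arbitrary illustrative choices, not the parameters of the actual system:

```python
import math
import random

def train_som(data, n_units=4, dim=2, epochs=50, lr=0.3, seed=0):
    """Train a tiny 1-D SOM (toy sketch with assumed parameters)."""
    rng = random.Random(seed)
    weights = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    for epoch in range(epochs):
        # Neighbourhood radius shrinks over training, down to 1 unit.
        radius = max(1.0, n_units / 2 * (1 - epoch / epochs))
        for x in data:
            # Best-matching unit: smallest squared distance to the input.
            bmu = min(range(n_units),
                      key=lambda i: sum((w - v) ** 2
                                        for w, v in zip(weights[i], x)))
            # Pull the BMU and its neighbours toward the input.
            for i in range(n_units):
                h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                weights[i] = [w + lr * h * (v - w)
                              for w, v in zip(weights[i], x)]
    return weights

def best_unit(weights, x):
    return min(range(len(weights)),
               key=lambda i: sum((w - v) ** 2 for w, v in zip(weights[i], x)))

# Two clusters standing in for two shape categories:
data = [[0.1, 0.1], [0.15, 0.05], [0.9, 0.9], [0.85, 0.95]]
som = train_som(data)
```

After training, inputs from the two clusters activate different regions of the map, which is the sense in which the SOM "categorizes" the explored objects.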


Johnsson, M., & Balkenius, C. (2006). Experiments with Artificial Haptic Perception in a Robotic Hand. Journal of Intelligent and Fuzzy Systems, 17 (4), 377-385.

Johnsson, M., & Balkenius, C. (2007a). Neural Network Models of Haptic Shape Perception. Journal of Robotics and Autonomous Systems, in press.

Johnsson, M., & Balkenius, C. (2007b). Experiments with Proprioception in a Self-Organizing System for Haptic Perception. To appear in Proceedings of TAROS 2007, University of Wales, Aberystwyth, UK.
