described previously, and, on the other hand, the
training port available on a Futaba 14SG RC
transmitter, which enables the PC to manage all the
drone controls. The connection with the RC is not
direct: it relies on an Arduino board that works as a
bridge, translating commands from USB into the
PWM signals required by the training port of the RC.
The ground station also provides the human operator
with real-time information about the trajectories and
tasks that the drone is performing, while allowing
the operator to define paths and tasks, or even to
take control of the drone for manual piloting.
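The USB-to-PWM translation performed by the Arduino bridge can be sketched as follows. This is a minimal illustration, assuming the common RC servo convention of 1000–2000 µs pulse widths with 1500 µs at stick center; the actual channel layout of the Futaba training port is not specified here, and the function name is hypothetical.

```python
def stick_to_pulse_us(cmd, min_us=1000, max_us=2000):
    """Map a normalized stick command in [-1, 1] to an RC PWM
    pulse width in microseconds (hypothetical helper; assumes the
    common 1000-2000 us range with 1500 us at stick center)."""
    cmd = max(-1.0, min(1.0, cmd))                 # clamp out-of-range commands
    span = max_us - min_us
    return int(round(min_us + (cmd + 1.0) * 0.5 * span))
```

With these assumptions, a centered stick maps to 1500 µs and full deflections to 1000 µs and 2000 µs; on the Arduino side the resulting pulse width would then be emitted on the pin wired to the training port.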
Figure 3: (a) Allegro hand with the tactile sensors installed
on the fingertips' surfaces. (b) Allegro hand grasping the
object to be manipulated. (c) 3D representations of the
pressure measurements registered by the arrays of tactile
sensors.
2.2 The Allegro Robotic Hand
The robotic manipulation system is based on the
Allegro robotic hand (Wonik Robotics, Seoul, Korea)
(see Figure 3). This hand has four fingers and
sixteen independent torque-controlled joints (four
DOF per finger). Its lightweight, portable,
anthropomorphic design makes it very suitable for
low-cost dexterous manipulation research. The hand
weighs 1.09 kg, can hold up to 5 kg, and has support
for real-time
control and online simulation. In addition, a set of
tactile sensors is employed as an additional tool in the
manipulation system (see Figure 3a). These sensors
are installed in an extrinsic configuration (Tegin,
2005) on the Allegro hand. The sensors are located
at the three fingertips that will be used during
manipulation (furthermore, only the last three
degrees of freedom of each finger will be controlled,
resulting in a non-redundant system). The tactile
sensors are pressure-sensing arrays, type PPS
RoboTouch (Pressure Profile Systems, Inc., Los
Angeles, CA, USA), which can register pressure
values in the range 0–140 kPa at a frequency of
30 Hz with a sensitivity of 0.7 kPa. Figure 3c shows a
3D representation of the pressure measurements
registered by these sensors during a manipulation
task. The force exerted by a fingertip is computed
from the pressure measurements of its tactile
sensors: each measurement is multiplied by the area
of its sensing element, 25 mm², so that the
corresponding forces are obtained, and the mean of
these forces is taken as the force applied by the
fingertip. Moreover, the contact point is assumed to
lie at the sensing element registering the maximum
pressure.
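The fingertip-force estimate described above can be sketched as follows. The function name and array layout are assumptions for illustration; the 25 mm² element area and kPa units come from the sensor specification.

```python
import numpy as np

TAXEL_AREA_M2 = 25e-6   # 25 mm^2 sensing-element area, in m^2

def fingertip_force(pressures_kpa):
    """Estimate the force applied by a fingertip from its tactile array.

    Each pressure reading (kPa) is converted to a force via F = p * A,
    the mean of these per-element forces is taken as the fingertip
    force, and the contact point is assumed to lie at the element
    registering the maximum pressure."""
    p_pa = np.asarray(pressures_kpa, dtype=float) * 1e3   # kPa -> Pa
    forces_n = p_pa * TAXEL_AREA_M2                       # per-element force (N)
    return forces_n.mean(), int(np.argmax(p_pa))
```

For example, readings of [100, 50, 0, 0] kPa give a mean force of 0.9375 N, with the contact located at the first element.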
3 GRASP PLANNER
This section presents the general structure of the
manipulation planner. The planner uses the
geometric models of the object and the fingers in
order to determine the contacts between the fingers
of the robotic hand and the manipulated object.
3.1 Kinematics Formulation
We consider the robotic hand as a set of $k$ fingers
with three degrees of freedom each. Each finger holds the
object through contact points with friction and
without slippage. In order to firmly grasp and
manipulate the object, the grasp is considered to be
an active form closure. Thus, each fingertip $i$
exerts a fingertip force $f_{Ci} \in \mathbb{R}^3$ within the friction
cone at the contact point. The grasping constraint
between the robot and the object is expressed by the
grasp matrix $J_G = [J_{G1}^T \dots J_{Gk}^T]^T \in \mathbb{R}^{3k \times 6}$ (Murray, 1994),
which relates the contact forces $f_C = [f_{C1}^T \dots f_{Ck}^T]^T$
at the fingertips to the resultant force and moment
$\tau_o \in \mathbb{R}^6$ on the object:

$$\tau_o = J_G^T \cdot f_C \qquad (1)$$

where $f_C$ and $\tau_o$ are both expressed in the object
coordinate frame $S_0$ fixed to the object's mass center.
This equation yields the kinematic relation
between the velocity of the object $\dot{x}_o \in \mathbb{R}^6$ and the
velocity of the contact point $v_{Ci} \in \mathbb{R}^3$:

$$v_{Ci} = J_{Gi} \cdot \dot{x}_o \qquad (2)$$
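A numeric sketch of this formulation follows, assuming point contacts with friction and forces expressed in a common object frame, with each partial grasp matrix taken in the standard form $J_{Gi} = [I_3 \;\; -[r_i]_\times]$ for a contact at position $r_i$ (dimensionally, with $J_G \in \mathbb{R}^{3k \times 6}$, the object wrench is obtained as $J_G^T f_C$). The contact positions and fingertip forces below are hypothetical values, not the paper's.

```python
import numpy as np

def skew(r):
    """Cross-product matrix [r]x such that skew(r) @ v == np.cross(r, v)."""
    x, y, z = r
    return np.array([[0.0, -z, y], [z, 0.0, -x], [-y, 0.0, x]])

def grasp_matrix(contacts):
    """Stack the partial matrices J_Gi = [I3  -[r_i]x] into J_G in R^{3k x 6}."""
    return np.vstack([np.hstack([np.eye(3), -skew(r)]) for r in contacts])

# Hypothetical contact positions (m) of three fingertips in the object frame
contacts = [np.array([0.00, 0.00,  0.03]),
            np.array([0.02, 0.00, -0.01]),
            np.array([-0.02, 0.00, -0.01])]
J_G = grasp_matrix(contacts)                       # shape (9, 6) for k = 3

# Hypothetical fingertip forces f_C in R^{3k}; Eq. (1) gives the object wrench
f_C = np.array([0, 0, -1.0,  0, 0, 2.0,  0, 0, 2.0])
tau_o = J_G.T @ f_C

# Eq. (2): contact-point velocities for a pure object translation along x
v_C = J_G @ np.array([0.01, 0, 0, 0, 0, 0])
```

Here the three forces produce a net +3 N along z with zero net moment (the lateral moments of the two side contacts cancel), and a pure object translation maps to identical contact-point velocities, as expected from the block structure of $J_G$.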