therefore it takes the sensor output 10ms to change
from the minimum of 0V to the maximum of 5V
from the point of contact with the object. The sensor
DC signal is reconstructed at 100ksps.
This gives 100ksps * 0.01s = 1000 signal
reconstruction points during the 10ms when the
sensor signal changes from 0 to 5V. This is
equivalent to 5N / 1000 = 5mN tactile grasping force
resolution. Note that during the fastest sensor signal
change rate, the control system can only distinguish
the grasping force at a resolution of 5mN despite the
fact that the A/D converter can provide a resolution
of 1.2mN.
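The figures above can be reproduced with a short calculation sketch. Python is used here purely for illustration; the 12-bit converter is an assumption implied by the quoted 1.2mN A/D resolution (5N / 4095).

```python
# Illustrative sketch of the 0-5N tactile sensor resolution figures.
# Values are taken from the text; the 12-bit converter is assumed from
# the quoted 1.2 mN A/D resolution (5 N / 4095).

FORCE_RANGE_N = 5.0          # tactile sensor range, 0-5 N
ADC_LEVELS = 2**12 - 1       # 12-bit converter -> 4095 steps
SAMPLE_RATE_SPS = 100_000    # 100 ksps reconstruction rate
TRANSITION_TIME_S = 0.010    # 0 V to 5 V swing takes 10 ms at 100 mm/s

adc_resolution_n = FORCE_RANGE_N / ADC_LEVELS            # ~0.0012 N (1.2 mN)
points_per_swing = SAMPLE_RATE_SPS * TRANSITION_TIME_S   # 1000 points
dynamic_resolution_n = FORCE_RANGE_N / points_per_swing  # 0.005 N (5 mN)

print(f"A/D resolution:     {adc_resolution_n * 1e3:.1f} mN")
print(f"Points per swing:   {points_per_swing:.0f}")
print(f"Dynamic resolution: {dynamic_resolution_n * 1e3:.1f} mN")
```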
To allow the prototype to manipulate heavier
objects, such as the plastic drink bottle described
earlier, the “bulk” grasping force sensing was
designed for a maximum of 20N.
The same analysis as for the 0-5N range tactile
sensor is applicable to the 0-20N range force sensor.
A 12 bit converter was selected, which provides a
signal conversion resolution of 20N / 4095 = 4.9mN.
The force sensor outputs 0-5V when the finger is
deflected 4mm at the tip (finger has compliant
joints). The grasping mechanism can move with a
maximum linear velocity of 100mm/s, therefore it
takes the sensor output 40ms to change from the
minimum of 0V to the maximum of 5V from the
point of contact with the object. The sensor DC
signal is reconstructed at 100ksps, which gives 100ksps * 0.04s = 4000 signal reconstruction points during the 40ms when the sensor signal changes from 0 to 5V. This is equivalent to 20N / 4000 = 5mN grasping force resolution. During the fastest sensor signal change rate the control system can therefore distinguish the grasping force at a resolution of 5mN, which closely matches the 4.9mN resolution of the A/D converter.
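The same sketch, with the stated 4mm fingertip deflection, 100mm/s maximum closing speed, 100ksps reconstruction rate and 12-bit converter, reproduces the bulk force sensor figures:

```python
# Illustrative sketch of the 0-20N bulk force sensor resolution figures,
# using the values stated in the text.

FORCE_RANGE_N = 20.0
ADC_LEVELS = 2**12 - 1            # 12-bit converter -> 4095 steps
SAMPLE_RATE_SPS = 100_000         # 100 ksps reconstruction rate
DEFLECTION_MM = 4.0               # fingertip deflection at full scale
CLOSING_SPEED_MM_S = 100.0        # maximum gripper closing speed

transition_time_s = DEFLECTION_MM / CLOSING_SPEED_MM_S    # 0.04 s (40 ms)
adc_resolution_n = FORCE_RANGE_N / ADC_LEVELS             # ~0.0049 N (4.9 mN)
points_per_swing = SAMPLE_RATE_SPS * transition_time_s    # 4000 points
dynamic_resolution_n = FORCE_RANGE_N / points_per_swing   # 0.005 N (5 mN)

print(f"Transition time:    {transition_time_s * 1e3:.0f} ms")
print(f"A/D resolution:     {adc_resolution_n * 1e3:.1f} mN")
print(f"Dynamic resolution: {dynamic_resolution_n * 1e3:.1f} mN")
```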
The position sensors output 0 to 5V per 90
degrees of rotation. Using a 12 bit A/D converter
gives a resolution of 90 deg / 4095 = 0.022 degrees.
If a resolution of 0.03 degrees is used, the positioning resolution at the end of a robot arm with two 300mm links is better than 1mm, which is adequate for reliable arm, hand and finger positioning during object manipulation.
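The end-point figure can be checked with a small geometric sketch. The worst-case model below (both joint quantisation errors adding in the same direction at full reach) is an assumption made for illustration, not necessarily the exact error model used in the project.

```python
import math

# Illustrative worst-case tip-position error for a two-link arm with
# 300 mm links, 12-bit position sensing over 90 degrees and an assumed
# working resolution of 0.03 degrees per joint.

LINK_MM = 300.0
ENCODER_RANGE_DEG = 90.0
ADC_LEVELS = 2**12 - 1

adc_resolution_deg = ENCODER_RANGE_DEG / ADC_LEVELS       # ~0.022 deg
working_resolution_deg = 0.03                             # margin over the A/D figure

# Tip displacement per joint step: arc length = radius * angle (in radians).
err_joint1_mm = 2 * LINK_MM * math.radians(working_resolution_deg)  # base joint, 600 mm radius
err_joint2_mm = LINK_MM * math.radians(working_resolution_deg)      # second joint, 300 mm radius
worst_case_mm = err_joint1_mm + err_joint2_mm                       # ~0.47 mm, i.e. better than 1 mm

print(f"A/D angular resolution: {adc_resolution_deg:.3f} deg")
print(f"Worst-case tip error:   {worst_case_mm:.2f} mm")
```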
Object handling experiments show that the chosen sensor signal feedback resolution and update frequency are adequate for reliable control of fragile object grasping and safe manipulation.
5 CONCLUSIONS
The main objective of this project is to develop object grasping and manipulation with collision detection and slippage control, to allow the robot to manipulate objects safely and with useful dexterity. The most challenging part of the project is obtaining useful feedback information from the sensors.
Currently the experimental tactile slippage
sensing approach is based on a parallel-jaw gripper
that provides acceptable slippage detection but only
in one axis (object rotation in the gripper is only
detected by vision). Further work is needed to
develop a robust slippage sensing strategy that
would provide reliable slippage status feedback to
help detect and prevent slippage in all axes.
Further work is also needed to add advanced functionality to the sensor fusion module and to improve the usefulness of the learning module.