of the focal length estimations is shown in Figure 6 (the estimated focal lengths in the u and v directions are nearly equal because the pixel dimensions are almost the same in both directions on the image sensor).
Figure 5: Image trajectory during the autocalibration
Figure 6: Convergence of the estimated focal lengths
8 CONCLUSIONS
We have proposed a new method for combining visual and force information that allows the intrinsic camera parameters to be updated during the task using an autocalibration approach. The visual-force control system has other original aspects that improve its behaviour. Among them are the variable weights applied to each sensor (depending on the GLR parameter) and the ability to manage contradictory control actions. As the results show, the robot is able to track the image trajectory while maintaining a constant contact force against the workspace, using visual and force information simultaneously.
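The weighted sensor fusion described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the GLR-to-weight mapping and the policy for contradictory actions are assumptions introduced here for clarity.

```python
import numpy as np

def fuse_control_actions(v_vision, v_force, glr):
    """Blend visual and force control velocities with weights derived
    from a GLR (generalized likelihood ratio) statistic.

    Illustrative sketch: the weighting law and the contradiction policy
    below are hypothetical, not the paper's exact scheme.
    """
    v_vision = np.asarray(v_vision, dtype=float)
    v_force = np.asarray(v_force, dtype=float)

    # Map the GLR statistic to a force weight in [0, 1]: a large GLR
    # (a likely contact event) shifts authority toward the force sensor.
    w_force = glr / (1.0 + glr)
    w_vision = 1.0 - w_force

    # Detect contradictory control actions: the two controllers push
    # in opposing directions (negative inner product).
    if float(np.dot(v_vision, v_force)) < 0.0:
        # One simple resolution policy: let the force controller
        # dominate so the constant contact force is preserved.
        return v_force

    # Otherwise, apply the variable weights to each sensor's command.
    return w_vision * v_vision + w_force * v_force
```

With agreeing commands the result is the weighted average; with opposing commands the force controller takes over, which is one plausible way to keep contact while the visual loop is overridden.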
REFERENCES
Baeten, J., De Schutter, J., 2002. Hybrid Vision/Force Control at Corners in Planar Robotic-Contour Following. IEEE/ASME Transactions on Mechatronics, vol. 7, no. 2, pp. 143-151.
Baeten, J., Bruyninckx, H., De Schutter, J., 2002. Shared Control in Hybrid Vision/Force Robotic Servoing using the Task Frame. In Proceedings of the 2002 IEEE/RSJ International Conference on Intelligent Robots and Systems, Lausanne, Switzerland, pp. 2128-2133.
Bruyninckx, H., De Schutter, J., 1996. Specification of force-controlled actions in the task frame formalism: a synthesis. IEEE Transactions on Robotics and Automation, vol. 12, no. 4, pp. 581-589.
Hutchinson, S., Hager, G., Corke, P., 1996. A Tutorial on
Visual Servo Control. IEEE Trans. on Robotics and
Automation, vol. 12, no. 5, pp. 651-670.
Marchand, E., Chaumette, F., 2002. Virtual Visual Servoing: a framework for real-time augmented reality. In EUROGRAPHICS 2002 Conference Proceedings, Computer Graphics Forum, vol. 21, no. 3, Saarbrücken, Germany, pp. 289-298.
Mezouar, Y., Chaumette, F., 2002. Path Planning for Robust Image-based Control. IEEE Transactions on Robotics and Automation, vol. 18, no. 4, pp. 534-549.
Morel, G., Malis, E., Boudet, S., 1998. Impedance based
combination of visual and force control. In IEEE Int.
Conf. on Robotics and Automation, Leuven, Belgium,
pp. 1743-1748.
Namiki, A., Nakabo, I., Ishikawa, M., 1999. High speed
grasping using visual and force feedback. In IEEE Int.
Conf. on Robotics and Automation, Detroit, MI, pp.
3195-3200.
Olsson, T., Bengtsson, J., Johansson, R., Malm, H., 2002. Force Control and Visual Servoing Using Planar Surface Identification. In IEEE Int. Conf. on Robotics and Automation, Washington, USA, pp. 4211-4216.
Pomares, J., Torres, F., 2005. Movement-flow based visual servoing and force control fusion for manipulation tasks in unstructured environments. IEEE Transactions on Systems, Man, and Cybernetics, Part C, vol. 35, no. 1, pp. 4-15.
Tsuji, T., Hiromasa, A., Kaneko, M., 1997. Non-contact impedance control for redundant manipulators using visual information. In IEEE Int. Conf. on Robotics and Automation, Albuquerque, USA, vol. 3, pp. 2571-2576.
Willsky, A. S., Jones, H. L., 1976. A generalized likelihood ratio approach to the detection and estimation of jumps in linear systems. IEEE Trans. Automat. Contr., vol. 21, no. 1, pp. 108-112.
Zhang, Z., 2000. A flexible new technique for camera
calibration. IEEE Transactions on Pattern Analysis and
Machine Intelligence, vol. 22, no. 11, pp. 1330-1334.
[Figure 6 plot: estimated focal length (mm) versus iterations]
ADAPTIVE VISUAL-FORCE CONTROL IN UNKNOWN WORKSPACES