Figure 5: Projected model on consecutive frames with large camera motion (the displacement of the target between frames is approximately 15-20 pixels).
Figure 6: Recovery from an incorrect pose (Left: incorrectly estimated pose; Right: correctly estimated pose three frames after the left image).
7 CONCLUSIONS
In this paper, we proposed a 3D tracking method that integrates 2D feature tracking. By tracking edges and maintaining 2D-3D correspondences, our tracker can handle large camera motions and can recover the correct pose even after pose estimation fails. Moreover, our tracker estimates the pose from both 2D-3D line segment correspondences and the motion of feature points. By fusing these two kinds of information, the tracker suppresses the influence of incorrect correspondences and can track even when a sufficient number of 2D-3D correspondences is not obtained. We also proposed a method for automatically estimating the camera pose and the 2D-3D correspondences, and succeeded in estimating both on the initial frame without manual intervention. The experiments confirmed that our tracker runs in real time on noisy, low-resolution images taken by an inexpensive USB camera.
As future work, we intend to measure the 3D positions of feature points that appear during tracking, by triangulating their 2D positions with the poses estimated over a few frames; we would then continue 2D tracking of these points and use their 2D-3D correspondences in later frames of the 3D tracking, as sketched below.