Figure 3: Yaw, pitch and roll angle error (deg) over time (s), comparing the Relative Pose Filter and the Orientation Error Filter.
4 CONCLUSIONS
In this paper, two inertial and vision data fusion
algorithms are proposed for attitude measurement. A
Kalman filter is employed for data fusion, and
different methods are designed to accommodate high- or
low-frequency visual data. In the orientation error
filter, the state error and residual are updated
whenever visual measurements are available, and the
residual is then used to compensate the estimated
attitude. In the relative pose filter, by contrast, the
state is optimally estimated at every frame. To
validate the performance of the two filters, a
trajectory with three degrees of freedom is designed.
The experimental results show that the proposed
methods achieve higher precision than visual data
alone, and that the orientation error filter works
well even with lower-frequency visual data, which
confirms the computational efficiency and reliability
of the proposed method.
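To make the multi-rate update scheme of the orientation error filter concrete, the following is a minimal sketch, assuming a scalar attitude error per axis and hypothetical sample rates, noise values, and function names; it only illustrates the idea of propagating the gyro-integrated attitude at a high rate and compensating it with the visual residual at a lower rate, not the paper's actual implementation.

```python
# Minimal sketch of an error-state update for a single attitude angle
# (e.g. yaw). Gyro integration runs at high rate; a correction is
# applied only when a lower-rate visual attitude measurement arrives.
# All parameter values below are illustrative, not taken from the paper.

GYRO_DT = 0.01   # gyro sample period in seconds (assumed 100 Hz)
Q = 1e-6         # process noise added to the attitude-error variance
R = 1e-4         # visual attitude measurement noise variance

def propagate(att, gyro_rate, P):
    """High-rate step: integrate the gyro and grow the error covariance."""
    att = att + gyro_rate * GYRO_DT   # nominal attitude from gyro integration
    P = P + Q                         # uncertainty of the attitude error grows
    return att, P

def visual_update(att, P, vis_att):
    """Low-rate step: estimate the attitude error from the visual residual
    and feed it back to compensate the gyro-integrated attitude."""
    residual = vis_att - att          # innovation between vision and gyro attitude
    K = P / (P + R)                   # scalar Kalman gain
    att = att + K * residual          # compensate the estimated attitude
    P = (1.0 - K) * P                 # reduce uncertainty after the update
    return att, P

# Usage example with synthetic data: 100 Hz gyro, visual update every 10th step.
att, P = 0.0, 1e-3
for k in range(1000):
    att, P = propagate(att, gyro_rate=0.1, P=P)
    if k % 10 == 0:                   # hypothetical 10 Hz visual measurement
        att, P = visual_update(att, P, vis_att=0.1 * (k + 1) * GYRO_DT)
```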
ACKNOWLEDGEMENTS
This work is partially supported by the National
Key Research and Development Program of China
under Grant 2016YFB0502004 and by the National
Natural Science Foundation of China under Grant
61320106010.