Table 2: The classification accuracy (%) of the segmented silhouette pose for different degrees n and orders m of the traditional radial Zernike implementation (Delibasis et al., 2016).
The proposed algorithm for fall detection has been applied to two video sequences containing 5 fall events, acquired by the fisheye camera at 15 fps with a frame size of 480x640 pixels. The confusion matrix for both videos is shown in Table 3.
Table 3: Confusion matrix for fall classification, from
(Delibasis and Maglogiannis, 2015).
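For reference, the sketch below shows how such a 2x2 confusion matrix can be tallied from per-event fall / no-fall decisions. It is a minimal illustration: the helper name and the example labels are assumptions and do not reproduce the figures reported in Table 3.

import numpy as np

def confusion_matrix_2x2(ground_truth, predicted):
    # 2x2 confusion matrix for fall (True) vs. no-fall (False) decisions,
    # one label per analysed event; rows = actual class, columns = predicted class.
    gt = np.asarray(ground_truth, dtype=bool)
    pr = np.asarray(predicted, dtype=bool)
    tp = int(np.sum(gt & pr))     # falls correctly detected
    fn = int(np.sum(gt & ~pr))    # falls missed
    fp = int(np.sum(~gt & pr))    # false alarms
    tn = int(np.sum(~gt & ~pr))   # non-falls correctly rejected
    return np.array([[tp, fn],
                     [fp, tn]])

# Illustrative labels only, not the actual evaluation data.
print(confusion_matrix_2x2([1, 1, 1, 1, 1, 0, 0, 0],
                           [1, 1, 1, 1, 0, 0, 1, 0]))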
4 CONCLUSIONS
A number of image processing and computer vision tasks have been presented, applied to images and videos acquired by a calibrated fisheye camera. First, we defined a metric for pixel distances based on the image formation model. Subsequently, we applied this metric to the definition of the Gaussian kernel, as well as to the redefinition of the Zernike Moment Invariants (ZMI). The corrected ZMI outperformed the traditional ones for pose recognition. Two more applications were reviewed, involving silhouette segmentation and fall detection, the latter without the requirement for full fisheye calibration. All these fisheye-specific processing tasks were applied in the spatial domain, without the need to remap the image to different grids or to correct for the strong distortions. These results support our position that efficient image processing and analysis algorithms can be performed directly in the fisheye image domain. Further work includes the application of a number of other feature extraction algorithms, such as SIFT, Harris corner detection and the Hough Transform.
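As an illustration of this position, the sketch below shows how a fisheye-aware pixel distance and a space-variant Gaussian kernel can be evaluated directly in the image domain. It is a minimal example assuming a simple equidistant projection r = f*theta with principal point (cx, cy); the function names, the example parameters and the projection model are illustrative assumptions, not the calibrated image formation model used in the paper.

import numpy as np

def pixel_to_ray(u, v, cx, cy, f):
    # Back-project a pixel to a unit ray on the sphere, assuming an
    # equidistant fisheye model r = f * theta (illustrative assumption;
    # a calibrated image formation model would be used in practice).
    dx, dy = u - cx, v - cy
    theta = np.hypot(dx, dy) / f          # angle from the optical axis
    phi = np.arctan2(dy, dx)              # azimuth in the image plane
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def pixel_distance(p, q, cx, cy, f):
    # Distance between two pixels measured as the angle between their
    # back-projected rays: a fisheye-aware replacement for the
    # Euclidean pixel distance.
    a = pixel_to_ray(p[0], p[1], cx, cy, f)
    b = pixel_to_ray(q[0], q[1], cx, cy, f)
    return np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))

def fisheye_gaussian_kernel(center, size, sigma, cx, cy, f):
    # Space-variant Gaussian kernel centred at 'center' (u, v): weights
    # decay with the angular distance, so the image can be filtered
    # directly in the fisheye domain without any remapping.
    half = size // 2
    k = np.empty((size, size))
    for i in range(size):
        for j in range(size):
            q = (center[0] + j - half, center[1] + i - half)
            d = pixel_distance(center, q, cx, cy, f)
            k[i, j] = np.exp(-d * d / (2.0 * sigma * sigma))
    return k / k.sum()

# Example: a 7x7 kernel near the periphery of a 640x480 frame
# (all parameter values are hypothetical).
kernel = fisheye_gaussian_kernel(center=(600, 240), size=7,
                                 sigma=0.02, cx=320, cy=240, f=180.0)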
REFERENCES
Hansen, P., Corke, P., Boles, W. and Daniilidis, K. (2007). Scale invariant feature matching with wide angle images. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1689-1694, San Diego, USA.
Lowe, D. (2004). Distinctive image features from scale-invariant keypoints. International Journal of Computer Vision, 60(2), 91-110.
Bulow, T. (2004). Spherical diffusion for 3D surface smoothing. IEEE Transactions on Pattern Analysis and Machine Intelligence, 26(12), 1650-1654.
Hara, K., Inoue, K., and Urahama, K. (2015). Gradient
operators for feature extraction from omnidirectional
panoramic images. Pattern Recognition Letters, 54,
89-96.
Cruz-Mota, J., Bogdanova, I., Paquier, B., Bierlaire, M., and Thiran, J. P. (2012). Scale invariant feature transform on the sphere: Theory and applications. International Journal of Computer Vision, 98(2), 217-241.
Andreasson, H., Treptow, A., and Duckett, T. (2005). Localization for mobile robots using panoramic vision, local features and particle filter. In Proceedings of the 2005 IEEE International Conference on Robotics and Automation (ICRA 2005), pp. 3348-3353.
Zhao, Q., Feng, W., Wan, L., and Zhang, J. (2015).
SPHORB: a fast and robust binary feature on the
sphere. International Journal of Computer Vision,
113(2), 143-159.
Li, H. and Hartley, R. (2006). Plane-based calibration and auto-calibration of a fish-eye camera. In P. J. Narayanan et al. (Eds.), ACCV 2006, LNCS 3851, pp. 21-30. Springer-Verlag, Berlin Heidelberg.
Shah, S. and Aggarwal, J. (1996). Intrinsic parameter calibration procedure for a high distortion fish-eye lens camera with distortion model and accuracy estimation. Pattern Recognition, 29(11), 1775-1788.
Delibasis, K. K., Plagianakos, V. P. and Maglogiannis, I. (2014). Refinement of human silhouette segmentation in omni-directional indoor videos. Computer Vision and Image Understanding, 128, 65-83.
Delibasis, K. K., Georgakopoulos, S. V., Kottari, K., Plagianakos, V. P. and Maglogiannis, I. (2016). Geodesically-corrected Zernike descriptors for pose recognition in omni-directional images. Integrated Computer-Aided Engineering, preprint, 1-15.
Geyer, C., and Daniilidis, K. (2001). Catadioptric projective geometry. International Journal of Computer Vision, 45(3), 223-243.
Delibasis, K. K., and Maglogiannis, I. (2015). A fall detection algorithm for indoor video sequences captured by fish-eye camera. In Proceedings of the 2015 IEEE 15th International Conference on Bioinformatics and Bioengineering (BIBE), pp. 1-5.