of $X$ and $N$ is the sample size. When the sample size is fairly large we can replace $S$ with $\hat{S}$ (Anderson, 1954). As a result, a new observation $X_{\text{test}}$ is classified to class $j$ whenever:
$$\chi^2_p(1-\alpha/2) \;\le\; N\,\bar{x}_{\text{test}}\,\hat{S}_j^{-1}\,\bar{x}_{\text{test}}^T \;\le\; \chi^2_p(\alpha/2) \qquad (19)$$

where $\chi^2_p(\alpha)$ is given by:

$$P\left(\chi^2_p > \chi^2_p(\alpha)\right) = \alpha.$$
In Eq.(19) the significance level is set to α = 0.05. We also compare the result of classification by distance to the center of mass with the result of Eq.(19).
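The decision rule of Eq.(19) can be sketched in code. This is an illustrative implementation, not the authors' code; the function name and data layout are assumptions. Note that with the upper-tail convention $P(\chi^2_p > \chi^2_p(\alpha)) = \alpha$, the cutoff $\chi^2_p(1-\alpha/2)$ corresponds to SciPy's `chi2.ppf(alpha/2, p)` and $\chi^2_p(\alpha/2)$ to `chi2.ppf(1 - alpha/2, p)`:

```python
import numpy as np
from scipy.stats import chi2


def classify_by_chi2(x_test_mean, S_hat_inv_by_class, N, p, alpha=0.05):
    """Return the class labels j for which the two-sided chi-square
    criterion of Eq.(19) accepts the new observation.

    x_test_mean        : (p,) sample mean of the test observations
    S_hat_inv_by_class : dict {j: (p, p) inverse covariance estimate}
    N                  : sample size used to form x_test_mean
    """
    # chi^2_p(1 - alpha/2): lower cutoff (upper-tail quantile convention)
    lo = chi2.ppf(alpha / 2, df=p)
    # chi^2_p(alpha/2): upper cutoff
    hi = chi2.ppf(1 - alpha / 2, df=p)
    accepted = []
    for j, S_inv in S_hat_inv_by_class.items():
        stat = N * x_test_mean @ S_inv @ x_test_mean  # quadratic form in Eq.(19)
        if lo <= stat <= hi:
            accepted.append(j)
    return accepted
```

Under this rule a test observation may be accepted by several classes (or none); a tie-breaking policy would be needed in practice.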
Table 1: Probability of correct classification within three classes $C_1$, $C_2$ and $C_3$, in comparison to the classifier resulting from Eq.(19).

            Class 1   Class 2   Class 3
$d_{R3}$      0.92      0.86      0.95
$d_F$         0.83      0.39      0.62
$d_{R1}$      0.91      0.51      0.78
$d_{R2}$      0.92      0.51      0.82
Eq.(19)       0.80      0.60      0.72
In addition, we applied our technique to the classification of human breast cancer cells undergoing treatment with different drugs. As explained before, our technique assigns points to clusters based on distances between covariance estimates rather than distances to the center of the class (the mean of the data point cloud). The original data set consisted of 11 different labels corresponding to 11 different treatments. Each label consisted of 382 wells, which were imaged using a PerkinElmer high content imaging system. Our preliminary results indicate that our average classification error is approximately 13% when half of the cells are used for training. A preliminary comparison with commonly used clustering techniques based on the sample average (mean) indicates that our performance is significantly better (by 5%), although this may be due to the large training set.
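The cluster-assignment idea described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses the Frobenius norm as the default distance between covariance estimates, whereas the paper's distances ($d_F$, $d_{R1}$, $d_{R2}$, $d_{R3}$) could be substituted via the `dist` argument:

```python
import numpy as np


def nearest_covariance_class(X_test, class_covs, dist=None):
    """Assign the test data matrix X_test (n x p) to the class whose
    covariance estimate is closest to the test sample covariance.

    class_covs : dict {label: (p, p) covariance estimate}
    dist       : distance between two SPD matrices; defaults to the
                 Frobenius norm of the difference (illustrative choice).
    """
    if dist is None:
        dist = lambda A, B: np.linalg.norm(A - B, ord="fro")
    S_test = np.cov(X_test, rowvar=False)  # sample covariance of test cloud
    return min(class_covs, key=lambda j: dist(S_test, class_covs[j]))
```

This contrasts with mean-based clustering, which would compare the sample mean of `X_test` to each class centroid instead of comparing covariance structures.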
5 CONCLUSIONS
In this paper we proposed a new technique for estimating positive definite matrices in the presence of uncertainty. Unlike commonly used estimators, which ignore the geometric constraint imposed by the positive definite property, our method uses the Fréchet mean, which implicitly accounts for the positive definite structure of the covariance matrix. We demonstrated the calculation of the proposed mean using three different distance measures, each of which may be the better choice in different applications. We demonstrated the applicability and performance of our techniques on a simulated data set, and our preliminary analysis suggests that the results are promising for the high content cell imaging classification problem.
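As a hedged illustration of computing a Fréchet mean of SPD matrices under one particular distance, consider the log-Euclidean metric, for which the minimizer has a closed form (this metric is an assumption for the example and is not necessarily one of the paper's three distance measures):

```python
import numpy as np
from scipy.linalg import expm, logm


def frechet_mean_log_euclidean(mats):
    """Frechet mean of SPD matrices under the log-Euclidean distance:
    the minimizer is the matrix exponential of the arithmetic mean of
    the matrix logarithms. Illustrative sketch only.
    """
    logs = [np.real(logm(M)) for M in mats]  # logm of SPD input is real
    return expm(np.mean(logs, axis=0))
```

For other distances, such as the affine-invariant Riemannian metric, no closed form exists in general and the mean is found iteratively (see, e.g., Jeuris et al., 2012).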
REFERENCES
Absil, P.-A., Mahony, R., and Sepulchre, R. (2009). Opti-
mization algorithms on matrix manifolds. Princeton
University Press.
Anderson, T. W. (1954). An Introduction to Multivariate
Statistical Analysis. Wiley Eastern Private Limited,
New Delhi.
Barachant, A., Bonnet, S., Congedo, M., and Jutten, C.
(2010). Riemannian geometry applied to BCI classi-
fication. In Latent Variable Analysis and Signal Sepa-
ration, pages 629–636. Springer.
Barbaresco, F. (2008). Innovative tools for radar signal pro-
cessing based on Cartan's geometry of SPD matrices
& information geometry. In Radar Conference, 2008.
RADAR'08. IEEE, pages 1–6.
Crosilla, F. and Beinat, A. (2002). Use of generalised pro-
crustes analysis for the photogrammetric block adjust-
ment by independent models. ISPRS Journal of Pho-
togrammetry and Remote Sensing, 56(3):195–209.
Jeuris, B., Vandebril, R., and Vandereycken, B. (2012). A
survey and comparison of contemporary algorithms
for computing the matrix geometric mean. Electronic
Transactions on Numerical Analysis, 39:379–402.
Johnson, R. A. and Wichern, D. W. (2002). Applied mul-
tivariate statistical analysis, volume 5. Prentice Hall
Upper Saddle River, NJ.
Li, Y. and Wong, K. M. (2013). Riemannian distances for
EEG signal classification by power spectral density.
IEEE Journal of Selected Topics in Signal Process-
ing.
MacKay, D. J. (1998). Introduction to Monte Carlo meth-
ods. In Learning in graphical models, pages 175–204.
Springer.
Mardia, K. V., Kent, J. T., and Bibby, J. M. (1979). Multi-
variate analysis. Academic Press.
Moakher, M. (2005). A differential geometric approach
to the geometric mean of symmetric positive-definite
matrices. SIAM Journal on Matrix Analysis and Ap-
plications, 26(3):735–747.
Pigoli, D., Aston, J. A., Dryden, I. L., and Secchi, P.
(2014). Distances and inference for covariance op-
erators. Biometrika.
Estimating Positive Definite Matrices using Fréchet Mean
299