A New Family of Bounded Divergence Measures and Application to Signal Detection
Shivakumar Jolad, Ahmed Roman, Mahesh C. Shastry, Mihir Gadgil, Ayanendranath Basu
2016
Abstract
We introduce a new one-parameter family of divergence measures, called bounded Bhattacharyya distance (BBD) measures, for quantifying the dissimilarity between probability distributions. These measures are bounded, symmetric, and positive semi-definite, and do not require absolute continuity. In the asymptotic limit, the BBD measure approaches the squared Hellinger distance. A generalized BBD measure for multiple distributions is also introduced. We prove an extension of a theorem of Bradt and Karlin for BBD relating Bayes error probability and divergence ranking. We show that BBD belongs to the class of generalized Csiszár f-divergences and derive some of its properties, such as its curvature and its relation to Fisher information. For distributions with vector-valued parameters, the curvature matrix is related to the Fisher-Rao metric. We derive certain inequalities between BBD and well-known measures such as the Hellinger and Jensen-Shannon divergences, as well as bounds on the Bayesian error probability. We apply these measures to a signal detection problem, comparing two monochromatic signals buried in white noise that differ in frequency and amplitude.
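The abstract builds on two standard quantities: the Bhattacharyya coefficient rho(p, q) = sum_i sqrt(p_i q_i) and the squared Hellinger distance H^2(p, q) = 1 - rho(p, q), which the abstract identifies as the asymptotic limit of the BBD family. The following is a minimal Python sketch of these building blocks for discrete distributions; the function names are illustrative assumptions, and the paper's exact one-parameter BBD formula is not reproduced here since it is not stated in the abstract.

import numpy as np

def bhattacharyya_coefficient(p, q):
    """Bhattacharyya coefficient rho(p, q) = sum_i sqrt(p_i * q_i)
    for two discrete distributions on a common support.
    rho = 1 iff p == q; rho = 0 iff the supports are disjoint."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(np.sqrt(p * q)))

def squared_hellinger(p, q):
    """Squared Hellinger distance
    H^2(p, q) = (1/2) * sum_i (sqrt(p_i) - sqrt(q_i))^2 = 1 - rho(p, q);
    per the abstract, this is the asymptotic limit of the BBD family."""
    return 1.0 - bhattacharyya_coefficient(p, q)

# Example: two Bernoulli distributions with different success probabilities.
p = [0.9, 0.1]
q = [0.5, 0.5]
print(bhattacharyya_coefficient(p, q))  # ~0.894
print(squared_hellinger(p, q))          # ~0.106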
References
- Ali, S. M. and Silvey, S. D. (1966). A general class of coefficients of divergence of one distribution from another. Journal of the Royal Statistical Society. Series B (Methodological), 28(1):131-142.
- Basseville, M. (1989). Distance measures for signal processing and pattern recognition. Signal Processing, 18:349-369.
- Basu, A. and Lindsay, B. G. (1994). Minimum disparity estimation for continuous models: efficiency, distributions and robustness. Annals of the Institute of Statistical Mathematics, 46(4):683-705.
- Bhattacharyya, A. (1946). On a measure of divergence between two multinomial populations. Sankhyā: The Indian Journal of Statistics (1933-1960), 7(4):401-406.
- Blackwell, D. (1951). Comparison of experiments. In Second Berkeley Symposium on Mathematical Statistics and Probability, volume 1, pages 93-102.
- Bradt, R. and Karlin, S. (1956). On the design and comparison of certain dichotomous experiments. The Annals of Mathematical Statistics, pages 390-409.
- Brody, D. C. and Hughston, L. P. (1998). Statistical geometry in quantum mechanics. Proceedings of the Royal Society of London. Series A: Mathematical, Physical and Engineering Sciences, 454(1977):2445-2475.
- Budzyński, R. J., Kondracki, W., and Królak, A. (2008). Applications of distance between probability distributions to gravitational wave data analysis. Classical and Quantum Gravity, 25(1):015005.
- Burbea, J. and Rao, C. R. (1982). On the convexity of some divergence measures based on entropy functions. IEEE Transactions on Information Theory, 28(3):489-495.
- Chernoff, H. (1952). A measure of asymptotic efficiency for tests of a hypothesis based on the sum of observations. The Annals of Mathematical Statistics, 23(4):493-507.
- Choi, E. and Lee, C. (2003). Feature extraction based on the Bhattacharyya distance. Pattern Recognition, 36(8):1703-1709.
- Csiszár, I. (1967). Information-type distance measures and indirect observations. Studia Scientiarum Mathematicarum Hungarica, 2:299-318.
- Csiszár, I. (1975). I-divergence geometry of probability distributions and minimization problems. The Annals of Probability, 3(1):146-158.
- DasGupta, A. (2011). Probability for Statistics and Machine Learning. Springer Texts in Statistics. Springer New York.
- Finn, L. S. (1992). Detection, measurement, and gravitational radiation. Physical Review D, 46(12):5236.
- Gibbs, A. and Su, F. (2002). On choosing and bounding probability metrics. International Statistical Review, 70(3):419-435.
- Hellinger, E. (1909). Neue Begründung der Theorie quadratischer Formen von unendlichvielen Veränderlichen. Journal für die reine und angewandte Mathematik (Crelle's Journal), (136):210-271.
- Hellman, M. E. and Raviv, J. (1970). Probability of Error, Equivocation, and the Chernoff Bound. IEEE Transactions on Information Theory, 16(4):368-372.
- Jaranowski, P. and Królak, A. (2007). Gravitational-wave data analysis. Formalism and sample applications: The Gaussian case. arXiv preprint arXiv:0711.1115.
- Kadota, T. and Shepp, L. (1967). On the best finite set of linear observables for discriminating two Gaussian signals. IEEE Transactions on Information Theory, 13(2):278-284.
- Kailath, T. (1967). The Divergence and Bhattacharyya Distance Measures in Signal Selection. IEEE Transactions on Communications, 15(1):52-60.
- Kakutani, S. (1948). On equivalence of infinite product measures. The Annals of Mathematics, 49(1):214-224.
- Kapur, J. (1984). A comparative assessment of various measures of directed divergence. Advances in Management Studies, 3(1):1-16.
- Kullback, S. (1968). Information theory and statistics. Dover, New York, 2nd edition.
- Kullback, S. and Leibler, R. A. (1951). On information and sufficiency. The Annals of Mathematical Statistics, 22(1):79-86.
- Kumar, U., Kumar, V., and Kapur, J. N. (1986). Some normalized measures of directed divergence. International Journal of General Systems, 13(1):5-16.
- Lamberti, P. W., Majtey, A. P., Borras, A., Casas, M., and Plastino, A. (2008). Metric character of the quantum Jensen-Shannon divergence. Physical Review A, 77:052311.
- Lee, Y.-T. (1991). Information-theoretic distortion measures for speech recognition. IEEE Transactions on Signal Processing, 39(2):330-335.
- Lin, J. (1991). Divergence measures based on the Shannon entropy. IEEE Transactions on Information Theory, 37(1):145-151.
- Matusita, K. (1967). On the notion of affinity of several distributions and some of its applications. Annals of the Institute of Statistical Mathematics, 19(1):181-192.
- Maybank, S. J. (2004). Detection of image structures using the Fisher information and the Rao metric. IEEE Transactions on Pattern Analysis and Machine Intelligence, 26(12):1579-1589.
- Nielsen, F. and Boltz, S. (2011). The Burbea-Rao and Bhattacharyya centroids. IEEE Transactions on Information Theory, 57(8):5455-5466.
- Nielsen, M. and Chuang, I. (2000). Quantum Computation and Quantum Information. Cambridge University Press, Cambridge, UK.
- Peter, A. and Rangarajan, A. (2006). Shape analysis using the Fisher-Rao Riemannian metric: Unifying shape representation and deformation. In 3rd IEEE International Symposium on Biomedical Imaging: Nano to Macro, pages 1164-1167. IEEE.
- Poor, H. V. (1994). An introduction to signal detection and estimation. Springer.
- Qiao, Y. and Minematsu, N. (2010). A study on invariance of f-divergence and its application to speech recognition. IEEE Transactions on Signal Processing, 58(7):3884-3890.
- Quevedo, H. (2008). Geometrothermodynamics of black holes. General Relativity and Gravitation, 40(5):971-984.
- Rao, C. (1982a). Diversity: Its measurement, decomposition, apportionment and analysis. Sankhyā: The Indian Journal of Statistics, Series A, pages 1-22.
- Rao, C. R. (1945). Information and the accuracy attainable in the estimation of statistical parameters. Bull. Calcutta Math. Soc., 37:81-91.
- Rao, C. R. (1982b). Diversity and dissimilarity coefficients: A unified approach. Theoretical Population Biology, 21(1):24-43.
- Rao, C. R. (1987). Differential metrics in probability spaces. Differential geometry in statistical inference, 10:217-240.
- Royden, H. (1986). Real analysis. Macmillan Publishing Company, New York.
- Sunmola, F. T. (2013). Optimising learning with transferable prior information. PhD thesis, University of Birmingham.
- Toussaint, G. T. (1974). Some properties of Matusita's measure of affinity of several distributions. Annals of the Institute of Statistical Mathematics, 26(1):389-394.
- Toussaint, G. T. (1975). Sharper lower bounds for discrimination information in terms of variation (corresp.). IEEE Transactions on Information Theory, 21(1):99-100.
- Toussaint, G. T. (1977). An upper bound on the probability of misclassification in terms of the affinity. Proceedings of the IEEE, 65(2):275-276.
- Toussaint, G. T. (1978). Probability of error, expected divergence and the affinity of several distributions. IEEE Transactions on Systems, Man and Cybernetics, 8(6):482-485.
- Tumer, K. and Ghosh, J. (1996). Estimating the Bayes error rate through classifier combining. In Proceedings of the 13th International Conference on Pattern Recognition, pages 695-699.
- Varshney, K. R. (2011). Bayes risk error is a Bregman divergence. IEEE Transactions on Signal Processing, 59(9):4470-4472.
- Varshney, K. R. and Varshney, L. R. (2008). Quantization of prior probabilities for hypothesis testing. IEEE Transactions on Signal Processing, 56(10):4553.
Paper Citation
in Harvard Style
Jolad S., Roman A., Shastry M., Gadgil M. and Basu A. (2016). A New Family of Bounded Divergence Measures and Application to Signal Detection. In Proceedings of the 5th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM, ISBN 978-989-758-173-1, pages 72-83. DOI: 10.5220/0005695200720083
in BibTeX Style
@conference{icpram16,
author={Shivakumar Jolad and Ahmed Roman and Mahesh C. Shastry and Mihir Gadgil and Ayanendranath Basu},
title={A New Family of Bounded Divergence Measures and Application to Signal Detection},
booktitle={Proceedings of the 5th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM},
year={2016},
pages={72-83},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005695200720083},
isbn={978-989-758-173-1},
}
in EndNote Style
TY - CONF
JO - Proceedings of the 5th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM
TI - A New Family of Bounded Divergence Measures and Application to Signal Detection
SN - 978-989-758-173-1
AU - Jolad S.
AU - Roman A.
AU - Shastry M.
AU - Gadgil M.
AU - Basu A.
PY - 2016
SP - 72
EP - 83
DO - 10.5220/0005695200720083