tails, for instance, on the hat. An intermediate value of k = 40 is a good compromise between the quality of the restored image and the processing time. In terms of speed, our algorithm is much faster than UINTA, thanks to the adaptive weighted kNN framework. Indeed, UINTA has to update the Parzen window size at each iteration, which requires a cross-validation optimization. In contrast, our method simply adapts to the PDF changes during the minimization process. For instance, the CPU time in the Matlab environment is almost 4500 seconds for the standard UINTA algorithm, whereas our algorithm, with k = 10, only requires around 600 seconds.
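To illustrate why a kNN-based estimator avoids any per-iteration bandwidth cross-validation, the sketch below computes a plain (unweighted) kNN density estimate in which the "window" is the distance to the k-th neighbour and therefore adapts automatically to the local sample density. It is an illustration only, not the exact AWkNN scheme of this paper; the function name knn_density and the toy data are assumptions.

import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gammaln


def knn_density(samples, queries, k=10):
    """kNN density estimate: the radius is the distance to the k-th
    neighbour, so it shrinks in dense regions and grows in sparse ones,
    with no cross-validated global bandwidth."""
    n, d = samples.shape
    dist, _ = cKDTree(samples).query(queries, k=k)   # shape (m, k)
    r_k = dist[:, -1]                                # adaptive radius per query
    # log-volume of a d-dimensional ball of radius r_k
    log_vol = d * np.log(r_k) + (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return np.exp(np.log(k) - np.log(n) - log_vol)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(5000, 2))                   # toy samples
    q = np.array([[0.0, 0.0], [3.0, 3.0]])
    print(knn_density(x, q, k=10))                   # higher density near the mode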
Table 1: RMSE and SSIM values for different values of k, and for UINTA.

  k      RMSE   SSIM
  3      5.511  0.918
  5      4.219  0.917
  10     4.012  0.910
  15     4.061  0.907
  20     4.142  0.905
  30     4.347  0.895
  40     4.498  0.889
  50     4.642  0.882
  80     4.960  0.867
  100    5.117  0.832
  200    5.541  0.805
  UINTA  4.651  0.890
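For reference, the RMSE and SSIM figures of Table 1 can be computed with a few lines of code. The sketch below is an illustrative helper, not the original experimental code; it assumes 8-bit grayscale arrays and uses the structural_similarity routine of scikit-image for the SSIM of Wang et al. (2004).

import numpy as np
from skimage.metrics import structural_similarity


def evaluate(clean, denoised):
    """Return (RMSE, SSIM) between the original and the restored image,
    assuming 8-bit grayscale inputs (data range 0..255)."""
    clean = clean.astype(np.float64)
    denoised = denoised.astype(np.float64)
    rmse = np.sqrt(np.mean((clean - denoised) ** 2))
    ssim = structural_similarity(clean, denoised, data_range=255.0)
    return rmse, ssim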
5 CONCLUSIONS
This paper presents a restoration method in the variational framework based on the minimization of the conditional entropy within a kNN framework. In particular, a new adaptive weighted kNN (AWkNN) approach has been proposed. The simulations indicate slightly better results in terms of RMSE and SSIM with respect to the UINTA algorithm, together with a marked gain in CPU time. This gain is due to the fact that AWkNN simply adapts to the PDF changes during the minimization process. The results are all the more promising considering that no regularization is applied.
As future work, a regularization method will be taken into account. Moreover, the dimension of the feature space can be increased by taking into account other image-related features; for instance, the image gradient components can be used as additional features, as sketched below.
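A minimal sketch of such an augmented feature vector follows. The helper name patch_gradient_features and the patch size are hypothetical, and the gradients are simple finite differences; this is only one possible way to add gradient components to the patch-based features.

import numpy as np


def patch_gradient_features(img, half=1):
    """For every interior pixel, stack its (2*half+1)^2 patch intensities
    together with the local gradient components (gy, gx)."""
    img = img.astype(np.float64)
    gy, gx = np.gradient(img)                        # finite-difference gradients
    h, w = img.shape
    feats = []
    for i in range(half, h - half):
        for j in range(half, w - half):
            patch = img[i - half:i + half + 1, j - half:j + half + 1].ravel()
            feats.append(np.concatenate([patch, [gy[i, j], gx[i, j]]]))
    return np.asarray(feats)                         # shape (n_pixels, patch_dim + 2)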
REFERENCES
Ahmad, I. A. and Lin, P. (1976). A nonparametric estimation of the entropy for absolutely continuous distributions. IEEE Transactions on Information Theory.
Awate, S. P. and Whitaker, R. T. (2006). Unsupervised, information-theoretic, adaptive image filtering for image restoration. IEEE Transactions on Pattern Analysis and Machine Intelligence, 28(3):364–376.
Boltz, S., Debreuve, E., and Barlaud, M. (2007). High-dimensional statistical distance for region-of-interest tracking: Application to combining a soft geometric constraint with radiometry. In IEEE International Conference on Computer Vision and Pattern Recognition (CVPR'07), Minneapolis, USA.
Carlsson, G., Ishkhanov, T., de Silva, V., and Zomorodian, A. (2007). On the local behavior of spaces of natural images. International Journal of Computer Vision.
Comaniciu, D. and Meer, P. (2002). Mean shift: A robust approach toward feature space analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence, 24(5):603–619.
Cover, T. and Thomas, J. (1991). Elements of Information Theory. Wiley-Interscience.
Dudani, S. (1976). The distance-weighted k-nearest-neighbor rule. IEEE Transactions on Systems, Man, and Cybernetics, SMC-6(4):325–327.
Elgammal, A., Duraiswami, R., and Davis, L. S. (2003). Probabilistic tracking in joint feature-spatial spaces. pages 781–788, Madison, WI.
Fukunaga, K. and Hostetler, L. D. (1975). The estimation of the gradient of a density function, with applications in pattern recognition. IEEE Transactions on Information Theory, 21(1):32–40.
Geman, S. and Geman, D. (1990). Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images. pages 452–472.
Huang, J. and Mumford, D. (1999). Statistics of natural images and models. pages 541–547.
Lee, A. B., Pedersen, K. S., and Mumford, D. (2003). The nonlinear statistics of high-contrast patches in natural images. International Journal of Computer Vision, 54(1-3):83–103.
Mount, D. M. and Arya, S. ANN: A library for approximate nearest neighbor searching. http://www.cs.umd.edu/~mount/ANN/.
Scott, D. (1992). Multivariate Density Estimation: Theory, Practice, and Visualization. Wiley.
Wang, Z., Bovik, A. C., Sheikh, H. R., and Simoncelli, E. P. (2004). Image quality assessment: From error visibility to structural similarity. IEEE Transactions on Image Processing, 13(4):600–612.
APPENDIX
In this section, the derivative of the conditional entropy of Eq. (3) is computed. Let us recall