An Image Impairment Assessment Procedure using the Saliency Map Technique

Hayato Teranaka, Minoru Nakayama

2016

Abstract

An automated assessment procedure is required to evaluate image quality and impairment. This paper proposes a procedure for image impairment assessment based on visual attention, using saliency maps of the impaired images. To evaluate the performance of this assessment procedure, an experiment was conducted to study viewers' subjective evaluations of impaired images, and the relationships between the viewers' ratings and a previously developed set of assessment values were then analyzed. The limitations of the developed procedure were also discussed in order to improve assessment performance, and the use of image features and frequency-domain representation values for the test images was proposed.
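The paper's own assessment procedure is not reproduced here. As an illustration of the kind of saliency map such a procedure builds on, the following is a minimal NumPy sketch of the spectral residual method (Hou & Zhang, 2007), one of several bottom-up saliency models; the authors' actual model and parameters may differ.

```python
import numpy as np

def spectral_residual_saliency(img):
    """Saliency map via the spectral residual method.

    img: 2-D grayscale array. Returns a saliency map in [0, 1].
    """
    f = np.fft.fft2(img)
    log_amp = np.log(np.abs(f) + 1e-8)   # log amplitude spectrum
    phase = np.angle(f)
    # Spectral residual = log amplitude minus its 3x3 local average,
    # computed here with edge-padded shifted views (a simple box filter).
    padded = np.pad(log_amp, 1, mode="edge")
    h, w = log_amp.shape
    local_avg = sum(padded[i:i + h, j:j + w]
                    for i in range(3) for j in range(3)) / 9.0
    residual = log_amp - local_avg
    # Back to the spatial domain: squared magnitude gives the saliency map.
    saliency = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return saliency / saliency.max()
```

Regions whose spectral content deviates from the image's average statistics (such as a localized impairment) receive higher saliency values, which is the intuition behind weighting impairment measures by visual attention.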



Paper Citation


in Harvard Style

Teranaka H. and Nakayama M. (2016). An Image Impairment Assessment Procedure using the Saliency Map Technique. In Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP, (VISIGRAPP 2016) ISBN 978-989-758-175-5, pages 73-78. DOI: 10.5220/0005638600730078


in Bibtex Style

@conference{visapp16,
author={Hayato Teranaka and Minoru Nakayama},
title={An Image Impairment Assessment Procedure using the Saliency Map Technique},
booktitle={Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP, (VISIGRAPP 2016)},
year={2016},
pages={73-78},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005638600730078},
isbn={978-989-758-175-5},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP, (VISIGRAPP 2016)
TI - An Image Impairment Assessment Procedure using the Saliency Map Technique
SN - 978-989-758-175-5
AU - Teranaka H.
AU - Nakayama M.
PY - 2016
SP - 73
EP - 78
DO - 10.5220/0005638600730078