Explicit Image Quality Detection Rules for Functional Safety in Computer Vision
Johann Thor Mogensen Ingibergsson, Dirk Kraft, Ulrik Pagh Schultz
2017
Abstract
Computer vision has applications in a wide range of areas, from surveillance to the safety-critical control of autonomous robots. Despite the potentially critical nature of these applications and continuous progress in the field, the focus on safety in relation to compliance with standards has been limited. As an example, field robots typically depend on a reliable perception system to sense and react to a highly dynamic environment. The perception system thus introduces significant complexity into the safety-critical path of the robotic system. This complexity is often argued to increase safety by improving performance; however, such safety claims are not supported by compliance with any standards. In this paper, we present rules that enable low-level detection of quality problems in images and demonstrate their applicability on an agricultural image database. We hypothesise that low-level, primitive image analysis driven by explicit rules facilitates compliance with safety standards, which improves the real-world applicability of previously proposed solutions. The rules are simple, independent image analysis operations focused on determining the quality and usability of an image.
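To make the idea concrete, the following is a minimal sketch, in Python with OpenCV, of what such simple, independent image-quality rules could look like. The specific checks (exposure, contrast and Laplacian-based sharpness), their thresholds and the file name frame.png are illustrative assumptions, not the rule set evaluated in the paper.

# Illustrative sketch only: each rule inspects one global property of the
# frame and rejects it if a (hypothetical) threshold is violated.
import cv2

def check_exposure(gray, low=30, high=225):
    """Reject frames that are almost entirely dark or saturated."""
    return low < gray.mean() < high

def check_contrast(gray, min_std=10.0):
    """Reject near-uniform frames (e.g. a fully occluded lens)."""
    return gray.std() >= min_std

def check_sharpness(gray, min_var=100.0):
    """Reject blurred or defocused frames via the variance of the Laplacian."""
    return cv2.Laplacian(gray, cv2.CV_64F).var() >= min_var

def image_usable(bgr):
    """A frame is usable only if every independent rule passes."""
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    return all(rule(gray) for rule in (check_exposure, check_contrast, check_sharpness))

if __name__ == "__main__":
    frame = cv2.imread("frame.png")  # hypothetical input image
    if frame is not None:
        print("usable" if image_usable(frame) else "reject frame")

Each rule is a stand-alone operation on the whole frame, and the frame is only declared usable if all rules pass, reflecting the paper's emphasis on simple, independent and explicitly stated checks.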
Paper Citation
in Harvard Style
Ingibergsson J., Kraft D. and Pagh Schultz U. (2017). Explicit Image Quality Detection Rules for Functional Safety in Computer Vision. In Proceedings of the 12th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 6: VISAPP (VISIGRAPP 2017), ISBN 978-989-758-227-1, pages 433-444. DOI: 10.5220/0006125604330444
in Bibtex Style
@conference{visapp17,
author={Johann Thor Mogensen Ingibergsson and Dirk Kraft and Ulrik Pagh Schultz},
title={Explicit Image Quality Detection Rules for Functional Safety in Computer Vision},
booktitle={Proceedings of the 12th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 6: VISAPP, (VISIGRAPP 2017)},
year={2017},
pages={433-444},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0006125604330444},
isbn={978-989-758-227-1},
}
in EndNote Style
TY - CONF
JO - Proceedings of the 12th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 6: VISAPP, (VISIGRAPP 2017)
TI - Explicit Image Quality Detection Rules for Functional Safety in Computer Vision
SN - 978-989-758-227-1
AU - Ingibergsson J.
AU - Kraft D.
AU - Pagh Schultz U.
PY - 2017
SP - 433
EP - 444
DO - 10.5220/0006125604330444
ER -