
AlexNet-level accuracy with 50x fewer parameters and <0.5 MB model size. arXiv preprint arXiv:1602.07360.

Inman, R. R., Blumenfeld, D. E., Huang, N., and Li, J. (2003). Designing production systems for quality: research opportunities from an automotive industry perspective. International Journal of Production Research, 41(9):1953–1971.

Jia, H., Murphey, Y. L., Shi, J., and Chang, T.-S. (2004). An intelligent real-time vision system for surface defect detection. In Proceedings of the 17th International Conference on Pattern Recognition (ICPR 2004), volume 3, pages 239–242. IEEE.

Knop, K. (2020). Indicating and analysis the interrelation between terms-visual: management, control, inspection and testing. Production Engineering Archives, 26.

Kopardekar, P., Mital, A., and Anand, S. (1993). Manual, hybrid and automated inspection literature and current research. Integrated Manufacturing Systems.

Krizhevsky, A., Sutskever, I., and Hinton, G. E. (2012). ImageNet classification with deep convolutional neural networks. Advances in Neural Information Processing Systems, 25:1097–1105.

Li, W.-C. and Tsai, D.-M. (2012). Wavelet-based defect detection in solar wafer images with inhomogeneous texture. Pattern Recognition, 45(2):742–756.

Malamas, E. N., Petrakis, E. G., Zervakis, M., Petit, L., and Legat, J.-D. (2003). A survey on industrial vision systems, applications and tools. Image and Vision Computing, 21(2):171–188.

Mueller, R., Franke, J., Henrich, D., Kuhlenkoetter, B., Raatz, A., and Verl, A. (2019a). Handbuch Mensch-Roboter-Kollaboration. Carl Hanser.

Mueller, R., Vette, M., Masiak, T., Duppe, B., and Schulz, A. (2019b). Intelligent real time inspection of rivet quality supported by human-robot-collaboration. SAE Technical Paper, 2(2019-01-1886).

Nessle Åsbrink, M. (2020). A case study of how industry 4.0 will impact on a manual assembly process in an existing production system: Interpretation, enablers and benefits.

Paszke, A., Gross, S., Massa, F., Lerer, A., Bradbury, J., Chanan, G., Killeen, T., Lin, Z., Gimelshein, N., Antiga, L., et al. (2019). PyTorch: An imperative style, high-performance deep learning library. Advances in Neural Information Processing Systems, 32.

Raabe, H., Myklebust, O., and Eleftheriadis, R. (2017). Vision based quality control and maintenance in high volume production by use of zero defect strategies. In International Workshop of Advanced Manufacturing and Automation, pages 405–412. Springer.

Ren, Z., Fang, F., Yan, N., and Wu, Y. (2021). State of the art in defect detection based on machine vision. International Journal of Precision Engineering and Manufacturing-Green Technology, pages 1–31.

Schneider, G., Masiak, T., Trampert, P., and Schmidt, F. (2023). Prüfen eines Prüflings, patent number 10 2021 210 572.6.

Schwab, K. (2017). The fourth industrial revolution. Currency.

See, J. E., Drury, C. G., Speed, A., Williams, A., and Khalandi, N. (2017). The role of visual inspection in the 21st century. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, volume 61, pages 262–266. SAGE Publications, Los Angeles, CA.

Shin, H.-C., Roth, H. R., Gao, M., Lu, L., Xu, Z., Nogues, I., Yao, J., Mollura, D., and Summers, R. M. (2016). Deep convolutional neural networks for computer-aided detection: CNN architectures, dataset characteristics and transfer learning. IEEE Transactions on Medical Imaging, 35(5):1285–1298.

Soukup, D. and Huber-Mörk, R. (2014). Convolutional neural networks for steel surface defect detection from photometric stereo images. In International Symposium on Visual Computing, pages 668–677. Springer.

Tsai, D.-M., Wu, S.-C., and Chiu, W.-Y. (2012a). Defect detection in solar modules using ICA basis images. IEEE Transactions on Industrial Informatics, 9(1):122–131.

Tsai, D.-M., Wu, S.-C., and Li, W.-C. (2012b). Defect detection of solar cells in electroluminescence images using Fourier image reconstruction. Solar Energy Materials and Solar Cells, 99:250–262.

Venhuizen, N., Evang, K., Basile, V., and Bos, J. (2013). Gamification for word sense labeling. In Proceedings of the 10th International Conference on Computational Semantics (IWCS 2013).

Wightman, R. (2019). PyTorch image models. https://github.com/rwightman/pytorch-image-models.

Woitschek, F. and Schneider, G. (2022). Online black-box confidence estimation of deep neural networks. In 33rd IEEE Intelligent Vehicles Symposium (IV22).

Wong, S. C., Gatt, A., Stamatescu, V., and McDonnell, M. D. (2016). Understanding data augmentation for classification: when to warp? In 2016 International Conference on Digital Image Computing: Techniques and Applications (DICTA), pages 1–6. IEEE.

Xu, Y., Jia, R., Mou, L., Li, G., Chen, Y., Lu, Y., and Jin, Z. (2016). Improved relation classification by deep recurrent neural networks with data augmentation. arXiv preprint arXiv:1601.03651.

Yazidi, K., Darmoul, S., and Hajri-Gabouj, S. (2018). Intelligent product quality control and defect detection: A case study. In 2018 International Conference on Advanced Systems and Electric Technologies (IC ASET), pages 98–103. IEEE.

Yosinski, J., Clune, J., Bengio, Y., and Lipson, H. (2014). How transferable are features in deep neural networks? In Advances in Neural Information Processing Systems, pages 3320–3328.

Zhang, R., Tsai, P.-S., Cryer, J. E., and Shah, M. (1999). Shape-from-shading: a survey. IEEE Transactions on Pattern Analysis and Machine Intelligence, 21(8):690–706.
IMPROVE 2024 - 4th International Conference on Image Processing and Vision Engineering