Authors:
Daniel Lehmann
and
Marc Ebner
Affiliation:
Institut für Mathematik und Informatik, Universität Greifswald, Walther-Rathenau-Straße 47, 17489 Greifswald, Germany
Keyword(s):
CNN, Out-of-Distribution Detection, Clustering.
Abstract:
A convolutional neural network model is able to achieve high classification performance on test samples at inference, as long as those samples are drawn from the same distribution as the samples used for model training. However, if a test sample is drawn from a different distribution, the performance of the model decreases drastically. Such a sample is typically referred to as an out-of-distribution (OOD) sample. Papernot and McDaniel (2018) propose a method, called Deep k-Nearest Neighbors (DkNN), to detect OOD samples via a credibility score. However, DkNN is slow at inference because it relies on a kNN search. To address this problem, we propose a detection method that uses clustering instead of a kNN search. We conducted experiments with different types of OOD samples for models trained on either MNIST, SVHN, or CIFAR10. Our experiments show that our method is significantly faster than DkNN, while achieving similar detection performance.
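The core idea of the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes toy Gaussian features as stand-ins for CNN activations, uses class means as trivial cluster centroids, and contrasts a per-query kNN-style score (distance over all training points, as in DkNN) with a cluster-based score (distance to a few precomputed centroids), which is why the latter is cheaper at inference.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for hidden-layer activations of two in-distribution classes
# (assumption: the actual method operates on real CNN layer activations).
train_feats = np.concatenate([
    rng.normal(loc=0.0, scale=0.5, size=(500, 8)),   # class 0
    rng.normal(loc=5.0, scale=0.5, size=(500, 8)),   # class 1
])

def knn_score(x, feats, k=5):
    """kNN-style OOD score: mean distance to the k nearest training points.
    Each query scans all N training features, hence O(N) per query."""
    d = np.linalg.norm(feats - x, axis=1)
    return np.sort(d)[:k].mean()

# Clustering-style alternative: summarize the training features by a handful
# of centroids once, offline (here simply the per-class means).
centroids = np.stack([train_feats[:500].mean(0), train_feats[500:].mean(0)])

def cluster_score(x, cents):
    """Cluster-based OOD score: distance to the nearest centroid.
    Each query touches only #clusters points, hence much faster than kNN."""
    return np.linalg.norm(cents - x, axis=1).min()

in_dist = rng.normal(0.0, 0.5, size=8)    # resembles class 0
ood     = rng.normal(20.0, 0.5, size=8)   # far from both classes

# Both scores separate OOD from in-distribution samples on this toy data.
assert cluster_score(ood, centroids) > cluster_score(in_dist, centroids)
assert knn_score(ood, train_feats) > knn_score(in_dist, train_feats)
```

A sample whose score exceeds a calibrated threshold would be flagged as OOD; the speed advantage comes from comparing each query against a few centroids instead of the full training set.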