mean is taken over the six vehicles used in each
case. The blue curve represents the nominal camera aperture, the green curve represents a camera aperture at 0.25 of the nominal setting, the red curve represents a camera aperture at 0.1340 of the nominal setting, and the cyan curve represents a camera aperture at 0.0625 of the nominal setting. As seen in the flatness of the curves in the figure, the face classification score depended only weakly on exposure duration in this experiment.
Figure 6: Face classification score versus camera exposure duration for four aperture values.
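As a rough illustration only, the sketch below shows how curves of the kind plotted in Figure 6 could be formed from a table of per-image results, averaging scores over the six vehicles for each combination of aperture and exposure. The file name and column names (classification_scores.csv, vehicle, aperture_fraction, exposure_ms, score) are assumptions for this sketch, not artifacts of the experiment.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical per-image results table; column names are assumed, not taken
# from the paper: vehicle, aperture_fraction, exposure_ms, score.
df = pd.read_csv("classification_scores.csv")

# Mean score over the six vehicles for each (aperture, exposure) cell,
# mirroring how the curves in Figure 6 are described as being formed.
means = (df.groupby(["aperture_fraction", "exposure_ms"])["score"]
           .mean()
           .unstack("aperture_fraction"))

# One curve per aperture setting, plotted against exposure duration.
means.plot(marker="o")
plt.xlabel("Exposure Duration (msec)")
plt.ylabel("Face Classification Score")
plt.title("Face Classification Score versus Camera Exposure Duration")
plt.legend(title="Aperture (fraction of nominal)")
plt.show()
```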
This weak dependence is at least partly due to xenon flash illuminators having a very short illumination duration. Most of the flash tube's illumination energy is produced during the first 0.5 msec. Thus, holding the camera exposure open for longer than this duration may not yield much more light energy and may increase the noise.
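To make this argument concrete, the toy model below integrates an assumed exponentially decaying flash pulse and adds a noise term that grows with exposure time; the decay constant and noise values are illustrative only and are not measurements from this experiment.

```python
import numpy as np

TAU_MS = 0.15            # assumed flash decay constant (msec); illustrative only
READ_NOISE = 1.0         # arbitrary fixed noise floor
DARK_NOISE_PER_MS = 0.5  # assumed noise accumulated per msec of open shutter

def captured_energy(exposure_ms, tau_ms=TAU_MS):
    """Fraction of the total flash energy collected while the shutter is open."""
    return 1.0 - np.exp(-exposure_ms / tau_ms)

def relative_snr(exposure_ms):
    """Toy signal-to-noise ratio: the signal saturates while noise keeps growing."""
    signal = captured_energy(exposure_ms)
    noise = READ_NOISE + DARK_NOISE_PER_MS * exposure_ms
    return signal / noise

for t in (0.5, 1.0, 1.5, 2.0):
    print(f"exposure {t:>4.1f} ms: energy fraction {captured_energy(t):.3f}, "
          f"relative SNR {relative_snr(t):.3f}")
```

Under these assumptions, nearly all of the flash energy is captured by 0.5 msec, while the noise term continues to grow with exposure duration, which is consistent with the flat curves in Figure 6.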
5 CONCLUSIONS
An experiment to investigate the illumination-variation robustness of a face classification method was performed. The camera parameters of aperture and exposure were varied to examine the low-light performance of the face classification method. Camera aperture was found to be a significant factor in accounting for the variation in face classification score during the experiment, while exposure duration, when using a xenon flash illuminator, was not statistically significant in explaining variations in face classification score. A reduction in illumination to 25% of that used to generate a properly exposed image for a human observer was found to produce adequate face classification scores. A further reduction to 13% may or may not be usable, depending on the application. Further, it was found that in this experiment, when using a xenon flash illuminator and training on a single exposure duration, increased exposure duration did not make up for illumination lost to a smaller camera aperture or, equivalently, to a smaller illuminator.
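The paper does not state which statistical test produced these significance findings; as one possible realization, the sketch below runs a two-factor analysis of variance on a hypothetical results table (file and column names are assumptions) to test whether aperture and exposure duration explain the variation in classification score.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical per-image results table; column names are assumed:
# aperture_fraction in {1.0, 0.25, 0.1340, 0.0625}, exposure_ms in {0.5, 1.0, 1.5, 2.0}.
df = pd.read_csv("classification_scores.csv")

# Two-factor ANOVA treating aperture and exposure duration as categorical factors.
model = ols("score ~ C(aperture_fraction) + C(exposure_ms)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```

A small p-value for the aperture factor and a large one for exposure duration would correspond to the pattern of results reported above.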