Figure 10: The direct reflection off the wall (blue, solid line)
provides the correct range measurement D. This measure-
ment, however, will be corrupted by other signals (e.g. the
red, dotted line), which will have a larger phase shift as they
have traveled a longer distance. This is known as the multi-
path interference error.
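The bias described in the caption can be illustrated with a minimal numerical sketch. A continuous-wave TOF camera estimates range from the measured phase shift as d = c·Δφ/(4π·f_mod); when a second, longer path is superimposed on the direct return, the demodulated phase, and hence the estimated range, is pulled towards the longer distance. All values below (modulation frequency, distances, amplitudes) are hypothetical and not properties of the hardware discussed here:

```python
import numpy as np

C = 3e8        # speed of light (m/s)
F_MOD = 20e6   # assumed modulation frequency (Hz), hypothetical

def range_to_phase(d):
    # Round trip of length 2*d modulated at F_MOD.
    return 4 * np.pi * F_MOD * d / C

def phase_to_range(phase):
    return C * phase / (4 * np.pi * F_MOD)

# Direct reflection at D = 2 m, plus a multi-path component that travelled
# 3 m and arrives with 30% of the direct amplitude (hypothetical values).
direct = np.exp(1j * range_to_phase(2.0))
multipath = 0.3 * np.exp(1j * range_to_phase(3.0))

# The pixel demodulates the sum of both signals, so the estimated
# range is biased towards the longer path.
d_meas = phase_to_range(np.angle(direct + multipath))
print(f"measured range: {d_meas:.2f} m")  # larger than the true 2.00 m
```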
Figure 11: The scattering effect of a ray of light in a TOF
camera. The incoming ray is not completely absorbed by
the pixel but is partially scattered. Part of the scattered en-
ergy reflects off the lens surface and back onto the sensor,
disturbing the measurements.
lute differences on the background are substantially
greater than the expected noise. These errors are due
to multi-path interference (figure 10) and scattering
(figure 11).
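Scattering-compensation methods such as those cited below model the observed image as a convolution of the ideal image with a scattering point spread function (PSF) and then invert that convolution. The following sketch illustrates this idea on a toy image; the PSF (a sharp direct component plus a weak uniform halo) and the regularisation constant are entirely hypothetical assumptions, not measured values:

```python
import numpy as np

def scatter(image, psf):
    """Simulate lens scattering as a circular convolution with a PSF."""
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(psf)))

def compensate(image, psf, eps=1e-3):
    """Regularised inverse filtering in the frequency domain."""
    H = np.fft.fft2(psf)
    return np.real(np.fft.ifft2(
        np.fft.fft2(image) * np.conj(H) / (np.abs(H) ** 2 + eps)))

# Toy 32x32 scene: a bright foreground square on a dark background.
scene = np.zeros((32, 32))
scene[12:20, 12:20] = 1.0

# Hypothetical PSF: 90% direct response plus a weak, uniform scattered halo.
psf = np.zeros((32, 32))
psf[0, 0] = 0.9
psf += 0.1 / psf.size

observed = scatter(scene, psf)     # background is lifted by scattered energy
restored = compensate(observed, psf)
```

After compensation the residual error against the true scene is much smaller than in the raw observation, which is the effect such methods aim for on real range data.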
7 FUTURE WORK
While the current hardware setup has a small field of
view, a prototype with a larger field of view is be-
ing developed. This will make it possible to monitor
a reasonably large volume. The calibration routine
also makes it possible to fuse data from multiple sen-
sors, extending the monitored volume even further.
We intend to integrate the calibration of the intrin-
sic and extrinsic parameters of the combined sensor
into one automatic procedure. To determine the point
spread functions of both the IR and TOF cameras, a
system will be set up in which a robot systematically
repositions the calibration target. The generated data
will provide a better understanding of the multi-path
interference and scattering errors observed in our ex-
periments. Error compensation methods such as those
of (Karel et al., 2012) and (Mure-Dubois and Hügli,
2007) improve the reliability of segmenting moving
objects and will increase the correspondence accuracy
between the IR and range data.
A set of fused range and IR data will be generated
and used to train people detection algorithms. The
three main hypotheses mentioned in the introduction
will be thoroughly tested. Applying 3D tracking al-
gorithms will increase the robustness and enable the
system to cope with occlusion.
8 CONCLUSIONS
A combined sensor for the detection of people us-
ing fused geometric and infrared radiation data was
introduced. We explained the working principles of
both sensors and illustrated and addressed some im-
portant accuracy issues that arose during experiments.
A method to calibrate a system with known relative
position and unknown relative orientation was pro-
posed. Three key areas in people detection that could
benefit greatly from the fused IR and range data were
determined and will be investigated in future work.
ACKNOWLEDGEMENTS
We thank ICRealisations for their input and for pro-
viding a prototype system for experimental work, and
Xenics for providing a thermal infrared camera for
verification purposes.
REFERENCES
Dollár, P., Wojek, C., Schiele, B., and Perona, P. (2011).
Pedestrian detection: An evaluation of the state of the
art. PAMI, 99.
Enzweiler, M. and Gavrila, D. M. (2009). Monocular pedes-
trian detection: Survey and experiments. TPAMI,
31(12):2179–2195.
Gandhi, T. and Trivedi, M. (2007). Pedestrian protec-
tion systems: Issues, survey, and challenges. IEEE
Transactions on Intelligent Transportation Systems,
8(3):413–430.
Hanning, T., Lasaruk, A., and Tatschke, T. (2011). Calibra-
tion and low-level data fusion algorithms for a parallel
2d/3d-camera. Information Fusion, 12(1):37–47.
Karel, W., Ghuffar, S., and Pfeifer, N. (2012). Modelling
and compensating internal light scattering in time of
flight range cameras. The Photogrammetric Record,
27(138):155–174.
Liang, F., Wang, D., Liu, Y., Jiang, Y., and Tang, S. (2012).
Fast pedestrian detection based on sliding window fil-
tering. In Proc. PCM 2012, pages 811–822, Berlin,
Heidelberg. Springer-Verlag.
McFarlane, N. and Schofield, C. (1995). Segmentation and
tracking of piglets in images. Machine Vision and Ap-
plications, 8(3):187–193.
Mure-Dubois, J. and Hügli, H. (2007). Real-time scattering
compensation for time-of-flight camera. Proceedings