condition. The rest of this section discusses the
sensitivity of the method to different operating parameters.
Figure 3: Experimental results under normal conditions. The measurement tables accompanying the figure are reproduced below (all values in metres; error = actual - measured):

Object 1:
PARAMETER  ACTUAL   MEASURED    ERROR
HEIGHT     0.34     0.32305      0.01695
WIDTH      0.55     0.579384    -0.029384
DEPTH      0.55     0.548208     0.001792

Object 2:
PARAMETER  ACTUAL   MEASURED    ERROR
HEIGHT     0.41     0.361635     0.048365
WIDTH      0.40     0.391544     0.008456
DEPTH      0.205    0.18906      0.01593

Object 3:
PARAMETER  ACTUAL   MEASURED    ERROR
HEIGHT     0.295    0.278583     0.016417
WIDTH      0.365    0.358162     0.006838
DEPTH      0.545    0.544632     0.000368

Object 4:
PARAMETER  ACTUAL   MEASURED    ERROR
HEIGHT     0.205    0.18461      0.02039
WIDTH      0.405    0.399843     0.005157
DEPTH      0.40     0.423643    -0.023643

Object 5:
PARAMETER  ACTUAL   MEASURED    ERROR
HEIGHT     0.305    0.316303    -0.011303
WIDTH      0.39     0.425641    -0.035641
DEPTH      0.395    0.413984    -0.018984
5.1 Integration Time
Objects at different distances require different
integration (exposure) times to compute range. For
instance, our experiments indicate that the measurement
error for an object at a far distance remains almost
stable when the integration time is set between 300 and
1000 microseconds. The adaptive method for setting the
integration time, described in Section 4.1, ensures
accurate results irrespective of object distance.
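A minimal sketch of such an amplitude-feedback selection loop is shown below. The camera call, array size, and threshold values are illustrative placeholders, not the exact procedure of Section 4.1:

import numpy as np

SATURATION_LEVEL = 4095    # assumed 12-bit amplitude range
TARGET_AMPLITUDE = 2000    # illustrative mid-range target level

def capture_amplitude(t_us: int) -> np.ndarray:
    # Placeholder for the camera SDK call. Simulates amplitude growing
    # linearly with integration time and clipping at saturation.
    frame = np.full((144, 176), 3.5) * t_us   # 176x144: a typical ToF array
    return np.minimum(frame, SATURATION_LEVEL)

def select_integration_time(t_min: int = 300, t_max: int = 1000) -> int:
    # Bisect the integration time (in microseconds) until the brightest
    # region of the amplitude image sits near the target level without
    # saturating.
    lo, hi = t_min, t_max
    while hi - lo > 10:                     # stop at 10 us tolerance
        t = (lo + hi) // 2
        peak = np.percentile(capture_amplitude(t), 99)  # robust peak
        if peak >= SATURATION_LEVEL:        # saturated: shorten exposure
            hi = t
        elif peak < TARGET_AMPLITUDE:       # too dim: lengthen exposure
            lo = t
        else:
            return t
    return lo

print(select_integration_time())            # -> 650 with the simulated camera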
5.2 Background Complexity
As shown in Figure 4 (a), despite the presence of
other objects, the algorithm successfully segments the
target and measures its dimensions. In addition, we
conducted experiments testing performance against a
highly reflective background, which tends to saturate
pixels faster and thus necessitates appropriate
integration-time selection. With the adaptive
integration-time setting procedure, this scenario is
handled successfully, as shown in Figure 4 (b).
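One plausible way to realise such clutter-tolerant segmentation is sketched below: band-pass the range image around a coarse distance estimate and keep the largest connected component. The function name and band width are assumptions for illustration; the paper's actual segmentation may combine further cues:

import numpy as np
from scipy import ndimage

def segment_target(range_img: np.ndarray, coarse_dist: float,
                   band: float = 0.15) -> np.ndarray:
    # Boolean mask of pixels within +/- band metres of the coarse
    # distance estimate.
    mask = np.abs(range_img - coarse_dist) < band
    labels, n = ndimage.label(mask)          # connected components
    if n == 0:
        return mask                          # nothing inside the band
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return labels == (np.argmax(sizes) + 1)  # keep the largest blob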
Figure 4: Results under complex conditions: (a) multiple
objects; (b) reflective background; (c) bright background.
Figure 4 (c) shows the performance of the algorithm
when the target object is placed under bright light. As
is evident, the proposed method is insensitive to bright
background lighting. This is because the algorithm uses
the amplitude image to extract object edges and corners;
any processing based on the intensity image, which is
sensitive to background lighting, would have produced
inaccurate edges and corners under such conditions.
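The amplitude image owes its contrast to the camera's own modulated illumination, so ambient light barely perturbs it. A minimal sketch of corner extraction on the amplitude image, using OpenCV's Harris detector (the parameter values are illustrative, not those used in the paper):

import cv2
import numpy as np

def amplitude_corners(amp: np.ndarray, rel_thresh: float = 0.01) -> np.ndarray:
    # Normalise the amplitude image to [0, 1] float32, then return the
    # (row, col) coordinates of Harris responses above a relative threshold.
    a = cv2.normalize(amp.astype(np.float32), None, 0, 1, cv2.NORM_MINMAX)
    response = cv2.cornerHarris(a, blockSize=2, ksize=3, k=0.04)
    return np.argwhere(response > rel_thresh * response.max())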
6 CONCLUSIONS
A new algorithm for measuring 3D object geometry
was presented in this paper. The proposed approach
was evaluated both quantitatively and qualitatively,
with appropriate illustrations, under normal and
challenging conditions. The ability of the proposed
approach to set the integration time dynamically makes
it robust under difficult operating conditions. In
addition to using the geometrical characteristics of the
target effectively, the developed method exploits
several information sources (the intensity, amplitude,
and range images) to ensure accurate dimension
measurement.