for an automated fusion of different low-cost 3D sensors. Sensors, 14(4):7563–7579.
El-laithy, R., Huang, J., and Yeh, M. (2012). Study on the use of Microsoft Kinect for robotics applications. In Position Location and Navigation Symposium (PLANS), pages 1280–1288.
Fossati, A., Gall, J., Grabner, H., Ren, X., and Konolige, K., editors (2013). Consumer Depth Cameras for Computer Vision: Research Topics and Applications (Advances in Computer Vision and Pattern Recognition). Springer.
Gallo, L., Placitelli, A., and Ciampi, M. (2011). Controller-free exploration of medical image data: Experiencing the Kinect. In Computer-Based Medical Systems (CBMS).
Handa, A., Whelan, T., McDonald, J., and Davison, A. J. (2014). A benchmark for RGB-D visual odometry, 3D reconstruction and SLAM. In International Conference on Robotics and Automation (ICRA).
Henry, P., Krainin, M., Herbst, E., Ren, X., and Fox, D. (2014). RGB-D mapping: Using depth cameras for dense 3D modeling of indoor environments. In Experimental Robotics, pages 477–491. Springer.
Heptagon Micro Optics (2015). SR4500 data sheet. http://downloads.mesa-imaging.ch/dlm.php?fname=pdf/SR4500_DataSheet.pdf/.
Huang, A. S., Bachrach, A., Henry, P., Krainin, M., Fox, D., and Roy, N. (2011). Visual odometry and mapping for autonomous flight using an RGB-D camera. In International Symposium of Robotics Research (ISRR).
Ihrke, I., Kutulakos, K. N., Lensch, H., Magnor, M., and Heidrich, W. (2010). Transparent and specular object reconstruction. In Computer Graphics Forum, volume 29, pages 2400–2426. Wiley Online Library.
Intel Corporation (2015a). Intel RealSense product brief. https://software.intel.com/sites/default/files/managed/0f/b0/IntelRealSense-WindowsSDKGold_PB_1114-FINAL.pdf.
Intel Corporation (2015b). RealSense 3D from lab to reality. http://iq-realsense.intel.com/from-lab-to-reality/.
Khoshelham, K. and Elberink, S. O. (2012). Accuracy and resolution of Kinect depth data for indoor mapping applications. Sensors, 12(2):1437–1454.
Lee, S.-O., Lim, H., Kim, H.-G., and Ahn, S. C. (2014). RGB-D fusion: Real-time robust tracking and dense mapping with RGB-D data fusion. In Intelligent Robots and Systems (IROS), pages 2749–2754.
Microsoft (2015a). Kinect 2 for Windows technical datasheet. http://www.microsoft.com/en-us/kinectforwindows/meetkinect/features.aspx.
Microsoft (2015b). Kinect for Windows technical datasheet. https://readytogo.microsoft.com/en-us/_layouts/RTG/AssetViewer.aspx?AssetUrl=https%3A%2F%2Freadytogo.microsoft.com%2Fen-us%2FAsset%2FPages%2F08%20K4W%20Kinect%20for%20Windows_Technical%20Datasheet.aspx.
Microsoft Research (2015). 3D surface reconstruction. http://research.microsoft.com/en-us/projects/surfacerecon/.
Newcombe, R. A., Izadi, S., Hilliges, O., Molyneaux, D., Kim, D., Davison, A. J., Kohli, P., Shotton, J., Hodges, S., and Fitzgibbon, A. (2011). KinectFusion: Real-time dense surface mapping and tracking. In International Symposium on Mixed and Augmented Reality (ISMAR), pages 127–136.
Nguyen, T. V., Feng, J., and Yan, S. (2014). Seeing human weight from a single RGB-D image. Journal of Computer Science and Technology, 29(5):777–784.
Occipital, Inc. (2015). Structure Sensor & SDK fact sheet. http://io.structure.assets.s3.amazonaws.com/Structure%20Sensor%20Press%20Kit.zip.
Papon, J., Abramov, A., Schoeler, M., and Wörgötter, F. (2013). Voxel cloud connectivity segmentation - supervoxels for point clouds. In Computer Vision and Pattern Recognition (CVPR), pages 2027–2034.
PMD Technologies GmbH (2015). Reference design brief CamBoard pico. http://www.pmdtec.com/html/pdf/PMD_RD_Brief_CB_pico_71.19k_V0103.pdf.
Schöning, J. (2015). Interactive 3D reconstruction: New opportunities for getting CAD-ready models. In Imperial College Computing Student Workshop (ICCSW), volume 49 of OpenAccess Series in Informatics (OASIcs), pages 54–61. Schloss Dagstuhl–Leibniz-Zentrum fuer Informatik.
Schöning, J. and Heidemann, G. (2015). Interactive 3D modeling - a survey-based perspective on interactive 3D reconstruction. In International Conference on Pattern Recognition Applications and Methods (ICPRAM), volume 2, pages 289–294. SCITEPRESS.
Wunder, E., Linz, A., Ruckelshausen, A., and Trabhardt, A. (2014). Evaluation of 3D-sensorsystems for service robotics in orcharding and viticulture. In VDI-Conference "Agricultural Engineering", VDI-Berichte Nr. 2226, pages 83–88. VDI-Verlag GmbH Düsseldorf.
Yip, H. M., Ho, K. K., Chu, M., and Lai, K. (2014). Development of an omnidirectional mobile robot using a RGB-D sensor for indoor navigation. In Cyber Technology in Automation, Control, and Intelligent Systems (CYBER), pages 162–167.
Zollhöfer, M., Theobalt, C., Stamminger, M., Nießner, M., Izadi, S., Rehmann, C., Zach, C., Fisher, M., Wu, C., Fitzgibbon, A., et al. (2014). Real-time non-rigid reconstruction using an RGB-D camera. ACM Transactions on Graphics, 33(4):1–12.
Taxonomy of 3D Sensors - A Survey of State-of-the-Art Consumer 3D-Reconstruction Sensors and their Field of Applications