IEEE Sensors Journal, vol. 17, pp. 2640-2641, 2017
This letter proposes a framework to perform 3D reconstruction using a heterogeneous sensor network, with potential use in augmented reality (AR), human behavior understanding, smart-room implementations, robotics, and many other applications. We fuse orientation measurements from inertial sensors, images from cameras, and depth data from Time-of-Flight (ToF) sensors within a probabilistic framework in a synergistic manner to obtain robust reconstructions. A fully probabilistic method is proposed to efficiently fuse the system's multi-modal data.
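The abstract does not spell out the fusion machinery, but the core idea behind any probabilistic multi-modal fusion of this kind can be illustrated with the standard Bayesian product-of-Gaussians update, in which independent estimates of the same quantity are combined by precision (inverse-covariance) weighting. The sketch below is not the letter's actual method; the sensor names, measurement values, and covariances are purely illustrative assumptions.

```python
import numpy as np

def fuse_gaussian(means, covs):
    """Fuse independent Gaussian estimates of the same quantity by
    precision (inverse-covariance) weighting -- the standard Bayesian
    product-of-Gaussians update. More certain sensors (smaller
    covariance) dominate the fused estimate."""
    precisions = [np.linalg.inv(c) for c in covs]
    fused_cov = np.linalg.inv(sum(precisions))
    fused_mean = fused_cov @ sum(p @ m for p, m in zip(precisions, means))
    return fused_mean, fused_cov

# Hypothetical 3D estimates of the same scene point from three
# modalities: a camera, a ToF depth sensor, and an IMU-aided
# prediction (all values are made up for illustration).
cam_mean, cam_cov = np.array([1.02, 0.48, 2.10]), np.diag([0.04, 0.04, 0.20])
tof_mean, tof_cov = np.array([0.98, 0.52, 2.01]), np.diag([0.10, 0.10, 0.01])
imu_mean, imu_cov = np.array([1.05, 0.50, 2.05]), np.diag([0.25, 0.25, 0.25])

mean, cov = fuse_gaussian([cam_mean, tof_mean, imu_mean],
                          [cam_cov, tof_cov, imu_cov])
print("fused point:", mean)
print("fused covariance diagonal:", np.diag(cov))
```

Note how the ToF sensor, assumed here to be most certain along depth (z), pulls the fused z coordinate toward its measurement, while the camera dominates the lateral (x, y) components; this complementary weighting is the sense in which heterogeneous sensors combine synergistically.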