A Unified Approach to Calibrate a Network of Camcorders and ToF cameras

Li Guan, Marc Pollefeys, 2008

In this paper, we propose a unified calibration technique for a heterogeneous sensor network of video camcorders and Time-of-Flight (ToF) cameras. By moving a spherical calibration target around the commonly observed scene, we can robustly and conveniently extract the sphere centers in the observed images and recover the geometric extrinsics for both types of sensors. The approach is evaluated on a real dataset of two HD camcorders and two ToF cameras, and 3D shapes are reconstructed from the calibrated system. The main contributions are: (1) we observe that the sphere surface point nearest to the ToF camera center always appears brightest, and use this fact to extract sphere centers in the ToF camera images; (2) we propose a unified calibration scheme despite the heterogeneity of the sensors. After calibration, this multi-modal sensor network can efficiently generate high-quality 3D shapes.
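The sphere-center extraction idea can be sketched as follows. This is a minimal illustration under our own assumptions (a pinhole intrinsic matrix `K`, radial depth values, an ideal single brightest pixel, and the function name itself), not the paper's implementation. The key geometric fact is that the surface point nearest the camera lies on the ray from the camera center through the sphere center, so the center sits one radius beyond that point along its viewing ray.

```python
import numpy as np

def sphere_center_from_tof(intensity, depth, K, radius):
    """Estimate the sphere center from one ToF frame.

    The brightest pixel is assumed to be the sphere surface point
    closest to the camera; the center then lies one radius further
    along the same viewing ray.
    """
    # Locate the brightest (frontmost) pixel.
    v, u = np.unravel_index(np.argmax(intensity), intensity.shape)
    d = depth[v, u]  # radial range to the surface point (assumption)
    # Back-project the pixel into a unit viewing ray.
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    ray /= np.linalg.norm(ray)
    # Surface point plus one radius along the ray gives the center.
    return ray * (d + radius)
```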


3D Object Reconstruction with Heterogeneous Sensor Data

Li Guan, Jean-Sebastien Franco, Marc Pollefeys, 2008

In this paper, we reconstruct 3D objects with a heterogeneous sensor network of Time-of-Flight (ToF) Range Imaging (RIM) sensors and high-resolution camcorders. With this setup, we first carry out a simple but effective depth calibration for the RIM cameras. We then combine the camcorder silhouette cues and RIM camera depth information for the reconstruction. Our main contribution is a sensor fusion framework whose computation is general, simple, and scalable. Although we only discuss the fusion of conventional cameras and RIM cameras in this paper, the proposed framework can be applied to any vision sensor. The framework uses a space occupancy grid as a probabilistic 3D representation of scene contents. After defining a sensing model for each type of sensor, the reconstruction becomes a Bayesian inference problem that can be solved robustly. Experiments show that the quality of the reconstruction is substantially improved over the noisy depth sensor measurements.
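The occupancy-grid fusion step can be illustrated with a small sketch. Assuming the sensor measurements are conditionally independent given each voxel's state (a standard assumption in such frameworks), the per-voxel Bayesian inference reduces to a product of likelihood ratios. The function name and input layout here are our own, not the paper's:

```python
import numpy as np

def fuse_occupancy(likelihood_ratios, prior=0.5):
    """Fuse per-sensor occupancy evidence into a posterior per voxel.

    likelihood_ratios: array of shape (n_sensors, n_voxels), each
    entry p(measurement | occupied) / p(measurement | empty).
    Under conditional independence, Bayes' rule multiplies the
    prior odds by the product of the sensor likelihood ratios.
    """
    odds = prior / (1.0 - prior) * np.prod(likelihood_ratios, axis=0)
    return odds / (1.0 + odds)
```

For example, one sensor strongly favoring occupancy (ratio 4) combined with one neutral sensor (ratio 1) yields a posterior of 0.8 under a uniform 0.5 prior.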

Graffiti Detection Using a Time-Of-Flight Camera

Federico Tombari, Luigi Di Stefano, Stefano Mattoccia, and Andrea Zanetti, 2008

Time-of-Flight (TOF) cameras are a recent and rapidly developing technology that has already proved useful for computer vision tasks. In this paper we investigate the use of a TOF camera for video-based graffiti detection, i.e. a monitoring system able to detect acts of vandalism such as dirtying, etching, and defacing walls and object surfaces. Experimental results show the promising capabilities of the proposed approach, with further improvements expected as the technology matures.

Visual Tracking Using Color Cameras and Time-of-Flight Range Imaging Sensors

Leila Sabeti, Ehsan Parvizi, Q.M. Jonathan Wu, 2008

This work proposes two particle filter-based visual trackers: one using images from a color camera and the other using images from a time-of-flight range imaging sensor. The two trackers were compared in order to identify the advantages and drawbacks of color-camera images versus time-of-flight range images for efficient visual tracking. The paper also contributes a novel combination of efficient methods that produces two stable and reliable human trackers, one for each camera.

Calibration of a PMD camera using a planar calibration object together with a multi-camera setup

Ingo Schiller, Christian Beder and Reinhard Koch, 2008

We discuss the joint calibration of novel 3D range cameras based on the time-of-flight principle with the Photonic Mixing Device (PMD) and of standard 2D CCD cameras. Due to their small field of view (fov) and low pixel resolution, PMD cameras are difficult to calibrate with traditional methods. In addition, the 3D range data contains systematic errors that need to be compensated. We therefore develop a calibration method that estimates the full intrinsic calibration of the PMD camera, including optical lens distortion and systematic range errors, and that calibrates its external orientation together with multiple 2D cameras rigidly coupled to it. The calibration approach uses a planar checkerboard pattern as reference, viewed from multiple angles and distances. By combining the PMD camera with standard CCD cameras, the internal camera parameters can be estimated more precisely and the limitations of the small fov can be overcome. Furthermore, we use the additional cameras to calibrate the systematic depth measurement error of the PMD camera. We show that our method significantly reduces the correlation between the rotation and translation estimates.
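The systematic depth-error compensation could be sketched as a simple curve fit: given reference distances recovered from the checkerboard by the calibrated CCD cameras and the corresponding raw PMD depth readings, fit a low-degree polynomial correction. This is a generic stand-in with hypothetical names, not the paper's exact error model:

```python
import numpy as np

def fit_depth_correction(measured, reference, degree=3):
    """Fit a polynomial mapping raw PMD depth readings to the
    reference depths observed by the calibrated CCD cameras."""
    coeffs = np.polyfit(measured, reference, degree)
    # Return a corrector mapping new raw depths to corrected depths.
    return lambda d: np.polyval(coeffs, d)
```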

Real-Time Estimation of the Camera Path from a Sequence of Intrinsically Calibrated PMD Depth Images

Christian Beder and Ingo Schiller and Reinhard Koch, 2008

In recent years, real-time active 3D range cameras based on time-of-flight technology using the Photonic Mixer Device (PMD) have been developed. These cameras produce sequences of low-resolution depth images at frame rates comparable to regular video cameras; compared to standard laser scanning, spatial resolution is thus traded for temporal resolution. In this work we propose an algorithm that reconstructs the path of a moving PMD depth camera. We derive a constraint describing the relative orientation between two calibrated PMD depth images and show how it can be used to efficiently estimate a camera trajectory from a sequence of depth images in real time. Estimating the trajectory of the PMD depth camera allows the depth measurements from a long sequence taken from a moving platform to be integrated, which increases the spatial resolution and enables interactively scanning objects with a PMD camera to obtain a dense 3D point cloud.
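Given point correspondences between two consecutive depth frames, a relative pose can be recovered in closed form. The paper derives its own constraint between calibrated depth images; the sketch below instead uses the generic Kabsch/SVD rigid alignment as a stand-in, to illustrate the kind of pairwise estimate that gets chained along the trajectory:

```python
import numpy as np

def relative_pose(P, Q):
    """Least-squares rigid motion (R, t) with Q ≈ R @ P + t,
    for corresponding 3D point sets P, Q of shape (3, n)."""
    cp = P.mean(axis=1, keepdims=True)
    cq = Q.mean(axis=1, keepdims=True)
    # Cross-covariance of the centered point sets.
    H = (P - cp) @ (Q - cq).T
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard keeps R a proper rotation (det = +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

Chaining such pairwise estimates over the sequence yields the camera path, along which the depth measurements can then be accumulated.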