Bayesian stereo: 3D vision designed for sensor fusion
25 October 2004
John Larson, Robert B. Pless
Abstract
Classical stereo algorithms attempt to reconstruct 3D models of a scene by matching points between two images. Finding corresponding points is a central part of this process, and matches are most commonly chosen as the minimum of an error function based on color or local texture. Here we motivate a probabilistic approach to this point-matching problem and provide an experimental design for empirically measuring the color-matching error of corresponding points. We use this empirically measured error distribution as a prior in a Bayesian scene reconstruction example, and show that we obtain better 3D reconstructions by not committing to a specific pixel match early in the visual processing. This allows a calibrated stereo camera to be treated as a probabilistic volume sensor, which can be more easily integrated with scene-structure measurements from other kinds of sensors.
© 2004 Society of Photo-Optical Instrumentation Engineers (SPIE).
John Larson and Robert B. Pless "Bayesian stereo: 3D vision designed for sensor fusion", Proc. SPIE 5608, Intelligent Robots and Computer Vision XXII: Algorithms, Techniques, and Active Vision, (25 October 2004); https://doi.org/10.1117/12.571537
KEYWORDS: Cameras, Calibration, 3D modeling, Error analysis, Visualization, 3D vision, Sensor fusion