Paper
30 April 1992

Sensor planning in an active robotic work cell
Steven Abrams, Peter K. Allen
Abstract
In this paper, we discuss techniques for extending the sensor planning capabilities of the machine vision planning system to include motion in a well-known environment. In a typical work cell, vision sensors are needed to monitor a task and provide feedback to motion control programs, or to assess task completion or failure. In planning sensor locations and parameters for such a work cell, all motion in the environment must be taken into account, both to avoid occlusion of the desired features by moving objects and, when the features to be monitored are themselves being manipulated by the robot, to ensure that the features remain within the camera's view. Several different sensor locations (or a single, movable sensor) may be required to view the features of interest over the course of the task. The goal is to minimize the number of sensors (or the motion of the single sensor) while guaranteeing a robust view at all times during the task, where a robust view is one that is unobstructed, in focus, and sufficiently magnified. Sensor planning techniques have in the past focused primarily on static environments. We present techniques that we have been exploring to incorporate knowledge of motion into the sensor planning problem. Possible directions for future research are also presented.
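To make the selection problem concrete, the following is a minimal sketch (not the method of the paper) of how a small set of camera poses might be chosen so that every time step of a task is covered by at least one robust view. It assumes a hypothetical predicate is_robust_view(pose, t) that encapsulates the occlusion, focus, and magnification tests; all identifiers (candidate_poses, task_steps, the toy visibility table) are placeholders introduced here for illustration.

```python
# Illustrative sketch only: greedy set cover over task time steps.
# A pose "covers" a time step if it yields a robust view (unobstructed,
# in focus, sufficiently magnified) at that step.

from typing import Callable, Dict, List, Set


def plan_sensor_poses(
    candidate_poses: List[str],
    task_steps: List[int],
    is_robust_view: Callable[[str, int], bool],
) -> List[str]:
    """Repeatedly pick the pose covering the most still-uncovered time steps.

    Returns the chosen poses, or raises if some time step cannot be covered
    by any candidate pose.
    """
    coverage: Dict[str, Set[int]] = {
        pose: {t for t in task_steps if is_robust_view(pose, t)}
        for pose in candidate_poses
    }
    uncovered: Set[int] = set(task_steps)
    chosen: List[str] = []
    while uncovered:
        best = max(coverage, key=lambda p: len(coverage[p] & uncovered))
        gained = coverage[best] & uncovered
        if not gained:
            raise ValueError(f"No candidate covers time steps {sorted(uncovered)}")
        chosen.append(best)
        uncovered -= gained
    return chosen


if __name__ == "__main__":
    # Toy visibility table standing in for the occlusion/focus/magnification tests.
    table = {"pose_A": {0, 1, 2}, "pose_B": {2, 3}, "pose_C": {3, 4}}
    print(plan_sensor_poses(list(table), [0, 1, 2, 3, 4],
                            lambda p, t: t in table[p]))
```

The single-movable-sensor variant described in the abstract would additionally weigh the motion required between consecutive poses, which a plain set-cover formulation like this one does not capture.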
© (1992) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Steven Abrams and Peter K. Allen "Sensor planning in an active robotic work cell", Proc. SPIE 1611, Sensor Fusion IV: Control Paradigms and Data Structures, (30 April 1992); https://doi.org/10.1117/12.57928
CITATIONS
Cited by 5 scholarly publications.
KEYWORDS
Sensors, Environmental sensing, Cameras, Visibility, Sensor fusion, Detection and tracking algorithms, 3D modeling