A method for the autonomous geolocation of ground vehicles in forest environments is discussed. The method estimates the global horizontal position of a vehicle strictly by finding a geometric match between a map of observed tree stems, scanned in 3D by Light Detection and Ranging (LiDAR) sensors onboard the vehicle, and another stem map generated from the structure of tree crowns analyzed in high-resolution aerial orthoimagery of the forest canopy. Stems are extracted from the 3D data using Support Vector Machine (SVM) classifiers and height-above-ground filters that separate ground points from vertical stem features. Stems are identified in the overhead imagery by finding the centroids of tree crowns extracted with a watershed segmentation algorithm. The two maps are matched using a robust Iterative Closest Point (ICP) algorithm that determines the rotation and translation needed to align the datasets; this alignment is then used to calculate the absolute horizontal location of the vehicle. The method has been tested with real-world data and has estimated vehicle geoposition with an average error of less than 2 m. Its accuracy is currently limited by the accuracy and resolution of the aerial orthoimagery used. The method can run in real time as a complement to the Global Positioning System (GPS) in areas where signal coverage is inadequate due to attenuation by the forest canopy or intentional denial of access. The method has two significant properties: (i) it requires no a priori knowledge of the area surrounding the robot, and (ii) it uses the geometry of detected tree stems as the only input for determining horizontal geoposition.
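The map-matching step described above is at its core a rigid 2D point-set registration problem. The following is a minimal trimmed-ICP sketch in Python (NumPy) that illustrates the idea; it is not the authors' implementation, and the trimming fraction, iteration count, and brute-force nearest-neighbor search are all assumptions made for brevity.

```python
import numpy as np

def best_rigid_transform(src, dst):
    # Least-squares rotation R and translation t mapping src onto dst
    # (Kabsch/SVD method), given already-matched point pairs.
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return R, t

def icp_2d(src, dst, n_iter=50, trim=0.8):
    # Trimmed ICP: at each iteration keep only the best `trim` fraction
    # of nearest-neighbor pairs, which adds robustness to stems present
    # in one map but missing from the other.
    cur = src.copy()
    R_total, t_total = np.eye(2), np.zeros(2)
    for _ in range(n_iter):
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        nn = d2.argmin(axis=1)
        dists = d2[np.arange(len(cur)), nn]
        keep = dists.argsort()[: int(trim * len(cur))]
        R, t = best_rigid_transform(cur[keep], dst[nn[keep]])
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

Once the LiDAR stem map is aligned to the georeferenced crown-centroid map, applying the recovered transform to the vehicle's position in the local frame yields its absolute horizontal location. A real system would also need an outlier-aware stopping criterion and a coarse initial guess when the displacement is large relative to stem spacing.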
The compact High Speed Scanning Lidar (HSSL) was designed to meet the requirements for a rover GN&C sensor. The eye-safe HSSL's fast scanning speed, low volume, and low power make it an ideal choice for a variety of real-time and non-real-time applications, including:
3D Mapping;
Vehicle guidance and Navigation;
Obstacle Detection;
Orbiter Rendezvous;
Spacecraft Landing / Hazard Avoidance.
The HSSL comprises two main hardware units: a Sensor Head and a Control Unit. In a rover application, the Sensor Head mounts on top of the rover, while the Control Unit can be mounted on the rover deck or within its avionics bay. An Operator Computer is used to command the lidar and immediately display the acquired scan data.
The innovative lidar design concept resulted from an extensive trade study conducted during the initial phase of an exploration rover program. The lidar couples an innovative scanner with a compact fiber laser and high-speed timing electronics. Compared to existing compact lidar systems, distinguishing features of the HSSL include its high accuracy, high resolution, high refresh rate, and large field of view. A further benefit of this design is the ability to quickly reconfigure scan settings to fit various operational modes.
Among the requirements of the future Constellation program are optimizing lunar surface operations and reducing hazards to astronauts. Toward this end, many robotic platforms, rovers in particular, are being sought to carry out a multitude of missions involving survey of potential EVA sites, surface reconnaissance, path planning, and obstacle detection and classification. 3D imaging lidar technology provides an enabling capability for fast, accurate, and detailed collection of three-dimensional information about the rover's environment. The lidar images the region of interest by scanning a laser beam and measuring the pulse time-of-flight and the bearing. The accumulated set of laser ranges and bearings constitutes the three-dimensional image.
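Each return in such an image reduces to a 3D point from exactly these two measurements: a round-trip time-of-flight and a scan bearing. The sketch below shows the standard conversion; the azimuth/elevation bearing convention and sensor-frame axes are assumptions for illustration, not the actual ILRIS-3D scan geometry.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_to_range(delta_t_s):
    # Round-trip pulse time-of-flight (seconds) to one-way range (meters).
    return C * delta_t_s / 2.0

def polar_to_xyz(r, az_rad, el_rad):
    # Range plus bearing (azimuth, elevation) to a Cartesian point in the
    # sensor frame (x forward, y left, z up -- an assumed convention).
    return (r * math.cos(el_rad) * math.cos(az_rad),
            r * math.cos(el_rad) * math.sin(az_rad),
            r * math.sin(el_rad))
```

For example, a pulse returning after about 66.7 ns corresponds to a target roughly 10 m away; sweeping the beam over a grid of bearings and applying this conversion per return accumulates the point cloud.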
As part of the ongoing NASA Ames Research Center activities in lunar robotics, the utility of 3D imaging lidar was evaluated by testing Optech's ILRIS-3D lidar on board the K-10 Red rover during the recent Human-Robotic Systems (HRS) field trials in Moses Lake, WA. This paper examines the results of the ILRIS-3D trials, presents the data obtained, and discusses its application in lunar surface robotic surveying and scouting.
Airborne laser terrain mapping systems have redefined the realm of topographic mapping. Lidars with kilohertz collection rates and long ranges have made airborne surveying a quick, efficient, and highly productive endeavor. Alongside current industry efforts to improve airborne lidar range, collection rate, resolution, and accuracy, and with the advent of Unmanned Aerial Vehicles (UAVs) and their myriad advantages, military and civil applications alike call for very compact and rugged lidar systems that can fit within the tight volume, form-factor, mass, and power constraints imposed by UAVs.
Optech has developed a very compact airborne laser terrain mapper geared toward UAV deployment. The system is a highly integrated unit that combines a lidar transceiver, a position and orientation sensor, and control electronics in a 1-cubic-foot, 57-lb package. This level of compactness is achieved by employing the latest laser technology alongside a very compact optical design and the latest control and data-collection architecture. This paper describes the UAV requirements that drove the system design, the technology employed, and the optimizations implemented to achieve the system's ultra-compact size.