KEYWORDS: Sensors, Navigation systems, Cameras, Global Positioning System, Detection and tracking algorithms, Bone, Robotic systems, Machine vision, 3D modeling, RGB color model
As part of the TARDEC-funded CANINE (Cooperative Autonomous Navigation in a Networked Environment)
Program, iRobot developed LABRADOR (Learning Autonomous Behavior-based Robot for Adaptive Detection and
Object Retrieval). LABRADOR was based on the rugged, man-portable, iRobot PackBot unmanned ground vehicle
(UGV) equipped with an explosive ordnance disposal (EOD) manipulator arm and a custom gripper. For
LABRADOR, we developed a vision-based object learning and recognition system that combined a TLD (tracking-learning-detection) filter based on object shape features with a color-histogram-based object detector. Our vision system was able to learn in real time to recognize objects presented to the robot. We also implemented a waypoint navigation system based
on fused GPS, IMU (inertial measurement unit), and odometry data. We used this navigation capability to implement
autonomous behaviors capable of searching a specified area using a variety of robust coverage strategies, including outward spiral, random bounce, random waypoint, and perimeter-following behaviors. While the full system was not
integrated in time to compete in the CANINE competition event, we developed useful perception, navigation, and
behavior capabilities that may be applied to future autonomous robot systems.
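As a rough illustration of the color-histogram half of this detector, the Python sketch below learns a hue-saturation histogram from a designated object region and back-projects it into new frames to localize the object, using OpenCV. The bin counts and the use of mean shift for localization are our assumptions for illustration, not the LABRADOR implementation.

```python
import cv2

def learn_color_model(frame_bgr, roi):
    """Build a hue-saturation histogram from the object region roi = (x, y, w, h)."""
    x, y, w, h = roi
    patch = cv2.cvtColor(frame_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([patch], [0, 1], None, [30, 32], [0, 180, 0, 256])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    return hist

def detect(frame_bgr, hist, window):
    """Back-project the model histogram and shift the search window onto the object."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([hsv], [0, 1], hist, [0, 180, 0, 256], 1)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)
    _, window = cv2.meanShift(backproj, window, criteria)
    return window  # updated (x, y, w, h) bounding box
```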
We introduce a new framework, Model Transition Control (MTC), that models robot control problems as sets of linear
control regimes linked by nonlinear transitions, and a new learning algorithm, Dynamic Threshold Learning (DTL), that
learns the boundaries of these control regimes in real time. We demonstrate that DTL can learn to prevent understeer and
oversteer while controlling a simulated high-speed vehicle. We also show that DTL can enable an iRobot PackBot to
avoid rollover in rough terrain and to actively shift its center-of-gravity to maintain balance when climbing obstacles. In
all cases, DTL is able to learn control regime boundaries in a few minutes, often with single-digit numbers of learning
trials.
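The abstract does not spell out the DTL update rule, so the sketch below illustrates only the general idea under our own assumptions: a single regime boundary (here, a maximum safe roll angle) is bisected toward a known-safe value after each trial that violates the regime and relaxed slightly after each success, a scheme consistent with convergence in single-digit numbers of trials.

```python
class ThresholdLearner:
    """Illustrative regime-boundary learner (our assumption, not the DTL algorithm)."""

    def __init__(self, initial, safe_value, relax=0.05):
        self.threshold = initial      # current regime boundary, e.g. max roll (deg)
        self.safe_value = safe_value  # a value known to lie inside the safe regime
        self.relax = relax            # small outward step after a successful trial

    def update(self, violated):
        """Call once per trial; violated=True if the regime was exceeded
        (e.g., the platform began to roll over)."""
        if violated:
            # Failure: bisect toward the known-safe value.
            self.threshold = 0.5 * (self.threshold + self.safe_value)
        else:
            # Success: creep outward so the boundary is not over-conservative.
            self.threshold += self.relax
        return self.threshold
```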
Currently deployed small UGVs operate at speeds up to around 6 mph and have proven their usefulness in explosive ordnance disposal (EOD) missions. As part of the TARDEC-funded Stingray Project, iRobot is investigating techniques
to increase the speed of small UGVs so they can be useful in a wider range of missions, such as high-speed
reconnaissance and infantry assault missions. We have developed a prototype Stingray PackBot, using wheels rather
than tracks, that is capable of traveling at speeds up to 18 mph. A key issue when traveling at such speeds is how to
maintain stability during sharp turns and over rough terrain. We are developing driver-assist behaviors that will provide
dynamic stability control for high-speed small UGVs using techniques such as dynamic weight shifting to limit oversteer
and understeer. These driver-assist behaviors will enable operators to use future high-speed small UGVs in high-optempo infantry missions while keeping warfighters out of harm's way.
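One hedged way such a behavior might detect understeer and oversteer, sketched below, is to compare the yaw rate measured onboard against the yaw rate a neutral-steering kinematic bicycle model predicts for the current speed and steering angle. The model, gains, and sign conventions here are illustrative assumptions, not the Stingray implementation.

```python
import math

def expected_yaw_rate(speed_mps, steer_rad, wheelbase_m=0.7):
    """Yaw rate a neutral-steering vehicle would exhibit (kinematic bicycle model)."""
    return speed_mps * math.tan(steer_rad) / wheelbase_m

def weight_shift_command(speed_mps, steer_rad, measured_yaw_rate,
                         deadband=0.1, gain=0.5):
    """Return a fore/aft weight-shift command in [-1, 1]; +1 = fully forward."""
    # error > 0: turning faster than commanded (oversteer);
    # error < 0: turning slower than commanded (understeer).
    error = abs(measured_yaw_rate) - abs(expected_yaw_rate(speed_mps, steer_rad))
    if abs(error) < deadband:
        return 0.0  # tracking the commanded turn; no correction needed
    # Oversteer -> shift weight rearward to add rear grip;
    # understeer -> shift weight forward to add front grip.
    return max(-1.0, min(1.0, -gain * error))
```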
Autonomous small UGVs have the potential to greatly increase force multiplication capabilities for infantry units. In
order for these UGVs to be useful on the battlefield, they must be able to operate under all-weather conditions. For the
Daredevil Project, we have explored the use of ultra-wideband (UWB) radar, LIDAR, and stereo vision for all-weather
navigation capabilities. UWB radar provides the capability to see through rain, snow, smoke, and fog. LIDAR and
stereo vision provide greater accuracy and resolution in clear weather but have difficulty with precipitation and
obscurants. We investigate the ways in which the sensor data from UWB radar, LIDAR, and stereo vision can be
combined to provide improved performance over the use of a single sensor modality. Our research includes both
traditional sensor fusion, where data from multiple sensors is combined in a single representation, and behavior-based
sensor fusion, where the data from one sensor is used to activate and deactivate behaviors using other sensor modalities.
We use traditional sensor fusion to combine LIDAR and stereo vision for improved obstacle avoidance in clear air, and
we use behavior-based sensor fusion to select between radar-based and LIDAR/vision-based obstacle avoidance based
on current environmental conditions.
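The sketch below illustrates this behavior-based fusion scheme in its simplest form: an estimate of current environmental conditions gates which obstacle-avoidance behavior is active on each control cycle. The class names, the obscurant-level estimate, and the 0.5 threshold are our assumptions, not the Daredevil code.

```python
class BehaviorArbiter:
    """Illustrative behavior-based fusion: conditions gate the active behavior."""

    def __init__(self, lidar_vision_avoid, radar_avoid):
        self.lidar_vision_avoid = lidar_vision_avoid  # fused LIDAR + stereo behavior
        self.radar_avoid = radar_avoid                # UWB-radar behavior

    def step(self, obscurant_level, sensor_data):
        """Run one control cycle; obscurant_level in [0, 1] might be estimated
        from LIDAR dropout rate (an assumption on our part)."""
        if obscurant_level > 0.5:
            # Rain, snow, smoke, or fog: LIDAR and vision degrade; trust radar.
            return self.radar_avoid.compute_velocity(sensor_data)
        # Clear air: LIDAR + stereo give better accuracy and resolution.
        return self.lidar_vision_avoid.compute_velocity(sensor_data)
```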
For the TARDEC-funded Stingray Project, iRobot Corporation and Chatten Associates are developing technologies that
will allow small UGVs to operate at tactically useful speeds. In previous work, we integrated a Chatten Head-Aimed
Remote Viewer (HARV) with an iRobot Warrior UGV, and used the HARV to drive the Warrior, as well as a small,
high-speed, gas-powered UGV surrogate. In this paper, we describe our continuing work implementing semiautonomous
driver-assist behaviors to help an operator control a small UGV at high speeds. We have implemented an
IMU-based heading control behavior that enables tracked vehicles to maintain accurate heading control even over rough
terrain. We are also developing a low-latency, low-bandwidth, high-quality digital video protocol to support immersive
visual telepresence. Our experiments show that a video compression codec using the H.264 algorithm can produce several times the resolution of a Motion JPEG video stream at the same limited bandwidth and the same low latency. With further enhancements, our H.264 codec will provide an order-of-magnitude improvement in quality while retaining latency comparable to Motion JPEG and operating within the same bandwidth.
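A minimal sketch of an IMU-based heading-hold behavior for a skid-steer (tracked) vehicle is shown below: a PD loop on heading error trims the left and right track speeds, with the derivative term on the IMU yaw rate damping oscillation when a track slips on rough terrain. The gains and interfaces are illustrative assumptions, not the behavior described above.

```python
import math

def wrap_angle(a):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

def heading_hold(target_heading, imu_heading, imu_yaw_rate, base_speed,
                 kp=2.0, kd=0.3):
    """Return (left_track, right_track) speeds that hold the target heading."""
    error = wrap_angle(target_heading - imu_heading)
    # Proportional on heading error, derivative on measured yaw rate; the
    # derivative term damps oscillation when a track slips on rough terrain.
    correction = kp * error - kd * imu_yaw_rate
    return base_speed - correction, base_speed + correction
```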
For the TARDEC-funded Daredevil Project, iRobot Corporation is developing capabilities that will allow small UGVs to
navigate autonomously in adverse weather and in foliage. Our system will fuse sensor data from ultra-wideband (UWB) radar, LIDAR, stereo vision, GPS, and INS to build maps of the environment showing which areas are passable (e.g., covered by tall grass) and which must be avoided (i.e., solid obstacles). In Phase I of this project, we demonstrated
that UWB radar sensors can see through precipitation, smoke/fog, and foliage and detect solid obstacles. In Phase II, we
are integrating all of these sensors with an iRobot PackBot. By the end of Phase II, we will demonstrate a fully autonomous Daredevil PackBot that can avoid obstacles, build maps, and navigate to waypoints in all-weather
conditions and through foliage.
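A hedged sketch of the passable/impassable classification described above: a grid cell that produces LIDAR returns but is transparent to UWB radar is likely foliage such as tall grass (passable), while a cell both sensors report is likely a solid obstacle. The hit-count thresholds and grid interface are our assumptions, not the Daredevil code.

```python
import numpy as np

FREE, FOLIAGE, SOLID = 0, 1, 2

def fuse_traversability(lidar_hits, radar_hits, lidar_thresh=3, radar_thresh=2):
    """Classify each map cell from per-cell hit counts (2D integer arrays)."""
    lidar_occ = lidar_hits >= lidar_thresh
    radar_occ = radar_hits >= radar_thresh
    grid = np.full(lidar_hits.shape, FREE, dtype=np.uint8)
    grid[lidar_occ & ~radar_occ] = FOLIAGE  # radar sees through it: tall grass etc.
    grid[radar_occ] = SOLID                 # radar return: treat as a hard obstacle
    return grid
```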
We developed and demonstrated a UAV package that works in conjunction with the PackBot UGV to allow medium-range missions. Both the UAV and UGV are man-portable, and the combined system can be used from unimproved
airfields such as soccer pitches. The UAV can carry up to 75 lbs of payload while weighing less than 30 lbs. This
document describes the initial proof of concept prototype, the associated ground and flight tests, and areas for further
development.
We are developing an ultra-wideband (UWB) radar sensor payload for the man-portable iRobot PackBot UGV. Our goal
is to develop a sensor array that will allow the PackBot to navigate autonomously through foliage (such as tall grass)
while avoiding obstacles and building a map of the terrain. We plan to use UWB radars in conjunction with other
sensors such as LIDAR and vision. We propose an algorithm for using polarimetric (dual-polarization) radar arrays to
classify radar returns as either vertically aligned foliage or solid objects based on their differential reflectivity, a function
of their aspect ratio. We have conducted preliminary experiments to measure the ability of UWB radars to detect solid
objects through foliage. Our initial results indicate that UWB radars are very effective at penetrating sparse foliage, but
less effective at penetrating dense foliage.
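The classification idea can be made concrete with a hedged sketch: differential reflectivity Z_DR = 10 log10(P_HH / P_VV) compares the power returned in the horizontal and vertical polarizations. Vertically elongated scatterers such as grass stalks return relatively more vertical power (Z_DR < 0), while compact solid objects return roughly equal power in both (Z_DR near 0). The -2 dB threshold below is illustrative, not a value measured in this work.

```python
import math

def classify_return(p_hh, p_vv, zdr_threshold_db=-2.0):
    """Label one radar return via differential reflectivity (illustrative threshold)."""
    zdr_db = 10.0 * math.log10(p_hh / p_vv)  # Z_DR: H-pol vs. V-pol return power
    return "foliage" if zdr_db < zdr_threshold_db else "solid"
```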
KEYWORDS: Roads, Reconnaissance, Stereo vision systems, LIDAR, Hough transforms, Global Positioning System, 3D vision, Sensors, Prototyping, Digital signal processing
For the Wayfarer Project, funded by the US Army through TARDEC, we have developed technologies that enable man-portable PackBot Wayfarer UGVs to perform autonomous reconnaissance in urban terrain. Each Wayfarer UGV can autonomously follow urban streets and building perimeters while avoiding obstacles and building a map of the terrain. Each UGV is equipped with a 3D stereo vision system, a 360-degree planar LIDAR, GPS, INS, compass, and odometry. The Hough transform is applied to LIDAR range data to detect building walls for street following and perimeter following. We have demonstrated Wayfarer's ability to autonomously follow roads in urban and rural environments while building a map of the surrounding terrain. Recently, we have developed a ruggedized version of the Wayfarer
Navigation Payload for use in rough terrain and all-weather conditions. The new payload incorporates a compact Tyzx G2 stereo vision module and a high-performance Athena Guidestar INS/GPS unit.
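A hedged sketch of the wall-detection step: each planar LIDAR return (x, y) votes for all lines (theta, rho) passing through it, and peaks in the accumulator correspond to building walls or street edges whose orientation the robot can follow. The bin counts and ranges below are illustrative assumptions.

```python
import numpy as np

def dominant_wall(points_xy, rho_max=20.0, n_theta=180, n_rho=200):
    """Return (theta, rho) of the strongest line among planar LIDAR returns."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    acc = np.zeros((n_theta, n_rho), dtype=np.int32)
    for x, y in points_xy:
        rho = x * cos_t + y * sin_t  # signed distance to the line, per theta
        idx = np.round((rho + rho_max) / (2 * rho_max) * (n_rho - 1)).astype(int)
        ok = (idx >= 0) & (idx < n_rho)
        acc[np.arange(n_theta)[ok], idx[ok]] += 1
    # The accumulator peak is the dominant wall; theta gives the heading the
    # robot should hold to travel parallel to it.
    t_i, r_i = np.unravel_index(np.argmax(acc), acc.shape)
    return thetas[t_i], r_i / (n_rho - 1) * 2 * rho_max - rho_max
```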
We are currently developing autonomous urban navigation capabilities for the iRobot PackBot. The TARDEC-funded Wayfarer Project is developing a modular navigation payload that incorporates LIDAR, vision, FLIR, and inertial navigation sensors. This payload can be attached to any PackBot and will provide the robot with the capability to perform fully autonomous urban reconnaissance missions. These capabilities will enable the PackBot Wayfarer to scout unknown territory and return maps along with video and FLIR image sequences. The Wayfarer navigation payload includes software components for obstacle avoidance, perimeter and street following, and map-building. The obstacle avoidance system enables the PackBot to avoid collisions with a wide range of obstacles in both outdoor and indoor environments. This system combines 360-degree planar LIDAR range data with 3D stereo vision obstacle detection, using a Scaled Vector Field Histogram algorithm. We use a real-time Hough transform to detect linear features in the range data that correspond to building walls and street orientations. We use the LIDAR range data to build an occupancy grid map of the robot's surroundings in real time. Data from the range sensors, obstacle avoidance, and the Hough transform are transmitted via UDP over wireless Ethernet to an OpenGL-based OCU that displays this information graphically and in real time.
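A hedged sketch of vector-field-histogram-style steering in the spirit of the algorithm named above: each range return blocks an angular sector of a polar histogram, with closer obstacles weighted more heavily and blocking wider sectors, and the robot steers toward the open sector nearest the goal bearing. All parameters, the scaling rule, and the openness threshold are our assumptions, not iRobot's SVFH implementation.

```python
import numpy as np

def pick_heading(ranges, bearings, goal_bearing,
                 n_bins=72, max_range=5.0, robot_radius=0.35, block_thresh=0.5):
    """Steer toward the open histogram sector nearest the goal bearing."""
    hist = np.zeros(n_bins)
    bin_width = 2 * np.pi / n_bins
    for r, b in zip(ranges, bearings):
        if r >= max_range:
            continue
        weight = (max_range - r) / max_range      # closer -> heavier vote
        half_width = np.arctan2(robot_radius, r)  # closer -> wider blocked sector
        lo = int(((b - half_width) % (2 * np.pi)) / bin_width)
        hi = int(((b + half_width) % (2 * np.pi)) / bin_width)
        bins = range(lo, hi + 1) if lo <= hi else \
            list(range(lo, n_bins)) + list(range(hi + 1))
        for i in bins:
            hist[i] += weight
    open_bins = np.where(hist < block_thresh)[0]
    if open_bins.size == 0:
        return None                               # boxed in: stop or back up
    centers = (open_bins + 0.5) * bin_width
    diffs = np.abs((centers - goal_bearing + np.pi) % (2 * np.pi) - np.pi)
    return float(centers[np.argmin(diffs)])
```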
The iRobot PackBot is a combat-tested, man-portable UGV that has been deployed in Afghanistan and Iraq. The PackBot is also a versatile platform for mobile robotics research and development that supports a wide range of payloads suitable for many different mission types. In this paper, we describe four R&D projects that developed experimental payloads and software using the PackBot platform. CHARS was a rapid development project to develop a chemical/radiation sensor for the PackBot. We developed the CHARS payload in six weeks and deployed it to Iraq to search for chemical and nuclear weapons. Griffon was a research project to develop a flying PackBot that combined the capabilities of a UGV and a UAV. We developed a Griffon prototype equipped with a steerable parafoil and gasoline-powered motor, and we completed successful flight tests including remote-controlled launch, ascent, cruising, descent, and landing. Valkyrie is an ongoing research and development project to develop a PackBot payload that will assist medics in retrieving casualties from the battlefield. Wayfarer is an applied research project to develop autonomous urban navigation capabilities for the PackBot using laser, stereo vision, GPS, and INS sensors.