Off-Road Navigation Under Sensing Uncertainty
Abstract
Off-road autonomous robots navigate unstructured environments using onboard computation and sensing, without prior maps or conventional traffic rules. These systems must therefore determine efficiently, on the fly, where they can and cannot drive. Diverse operating environments and incomplete sensing, caused by occlusions or limited sensor range, make it difficult to build an accurate understanding of the surroundings. A motion planner in this context predicts the robot's actions seconds to minutes ahead, accounting for environmental obstacles, vehicle dynamics, and unknown regions. Existing systems that ignore uncertain or incomplete sensing are prone to dangerous decisions from over-optimism, or to unnecessarily inefficient paths from over-pessimism. This dissertation addresses those shortcomings by reasoning explicitly about sensing uncertainty, proposing three strategies for tractably reasoning over and reducing uncertainty in the context of off-road autonomy. We first discuss the pitfalls of ignoring uncertainty and present a simple, local uncertainty-aware algorithm that traverses safer, more efficient paths. We then consider uncertainty beyond the local map and leverage sparse long-range sensor information to guide planning; we show that this algorithm makes earlier, less myopic decisions to avoid distant obstacles. Finally, to reduce uncertainty, we introduce a technique that harnesses off-the-shelf video footage to rapidly adapt both perception and planning to new environments, achieving performance comparable to that obtained from costly, hard-to-collect robotic data. The research is grounded in real-world applicability, with the objectives of minimizing time to reach the goal and the number of safety interventions. Conducted as part of the DARPA RACER program, this work addresses the real-world challenges that uncertainty poses for these robots.
Description
Thesis (Ph.D.)--University of Washington, 2025
