Vision and Control for Insect-scale Robots
Abstract
This dissertation advances vision and control for insect-scale robots, tackling two critical challenges: hovering and navigation.

For hovering, the first study, presented in the TinySense paper (Best Student Paper Award at ICRA 2025), develops a lightweight, energy-efficient avionics system that integrates a pressure sensor and a tiny global-shutter camera for state estimation. Validated on the Crazyflie drone, the system demonstrates robust hovering state estimation comparable to both the Crazyflie's built-in state estimator and ground truth from a motion-capture (mocap) system. The second study introduces a gyroscope-free, visual-inertial flight control system for 10 mg robots that combines an accelerometer with a tiny optic-flow camera for effective wind rejection, inspired by biological sensory fusion in fruit flies. This approach, detailed in our Science Robotics paper, significantly reduces weight and power consumption, enabling stable flight even under wind disturbances.

For navigation, we first introduce a biologically inspired, bilinear optic-flow approximation suited to confined-space navigation in environments where traditional mapping techniques such as SLAM are infeasible due to speed, size, weight, and power (SSWaP) constraints. This method leverages learned optic-flow patterns to stabilize heading within a corridor, as outlined in our IROS paper, demonstrating confined-space navigation without reliance on a pre-mapped environment. Building on this foundation, we then leverage the emergence of embedded event cameras, whose characteristics align well with the sensing constraints of insect-scale robots, to tackle navigation in confined spaces. We develop a deep reinforcement learning framework that enables vision-based navigation from onboard event-camera data alone. This approach has the potential to enable autonomous flight for insect-scale robots in confined spaces without external localization or mapping. Together, these contributions advance the prospect of fully autonomous, insect-scale robotic platforms operating in real-world conditions.
Description
Thesis (Ph.D.)--University of Washington, 2025
