Learning agile flight in cluttered environments for quadrotors from pixel information using differentiable simulators and novel sensors


This project aims to enable high-speed, agile navigation for autonomous aerial robots in visually degraded and environmentally challenging conditions: post-earthquake zones, underground exploration, anti-poaching missions, and night-time search and rescue operations, where conventional sensors such as LiDAR and standard cameras degrade or fail. To address these challenges, the project introduces a novel visuo-sonic sensing suite that combines low-power event cameras with ultrasound sensors, which remain effective in darkness, fog, and smoke and can detect transparent obstacles that defeat optical sensing. Leveraging differentiable physics-based learning, the system adapts its control policies in real time to these unconventional sensor inputs, ensuring reliable operation in harsh and dynamically changing environments.
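The core idea behind differentiable physics-based learning is that the simulator itself is written so that gradients of a task loss flow backward through every simulation step into the policy parameters. The toy sketch below illustrates this with a 1-D altitude-hold task and a two-gain PD policy, using forward-mode automatic differentiation via dual numbers; the model, policy, and gains are illustrative stand-ins chosen for this example, not the project's actual code.

```python
# Minimal sketch of policy optimization through a differentiable simulator.
# The 1-D altitude model, the PD policy, and the gains are illustrative
# assumptions, not the project's real pipeline.

class Dual:
    """Scalar carrying exact partial derivatives w.r.t. the two policy
    gains (forward-mode automatic differentiation)."""
    def __init__(self, val, grad=(0.0, 0.0)):
        self.val = val
        self.grad = grad

    def _lift(self, other):
        return other if isinstance(other, Dual) else Dual(float(other))

    def __add__(self, other):
        o = self._lift(other)
        return Dual(self.val + o.val,
                    (self.grad[0] + o.grad[0], self.grad[1] + o.grad[1]))
    __radd__ = __add__

    def __sub__(self, other):
        o = self._lift(other)
        return Dual(self.val - o.val,
                    (self.grad[0] - o.grad[0], self.grad[1] - o.grad[1]))

    def __mul__(self, other):
        o = self._lift(other)
        return Dual(self.val * o.val,
                    (self.val * o.grad[0] + o.val * self.grad[0],
                     self.val * o.grad[1] + o.val * self.grad[1]))
    __rmul__ = __mul__

    def __neg__(self):
        return Dual(-self.val, (-self.grad[0], -self.grad[1]))


def rollout_loss(k1, k2, z_target=1.0, dt=0.05, steps=100):
    """Simulate a 1-D 'hover at z_target' task and return the tracking
    loss together with its exact gradient w.r.t. the PD gains k1, k2."""
    K1 = Dual(k1, (1.0, 0.0))  # seed d/dk1
    K2 = Dual(k2, (0.0, 1.0))  # seed d/dk2
    z, v = Dual(0.0), Dual(0.0)
    loss = Dual(0.0)
    for _ in range(steps):
        err = z - z_target
        u = -(K1 * err + K2 * v)  # PD thrust command
        v = v + dt * u            # semi-implicit Euler integration
        z = z + dt * v
        loss = loss + err * err
    return loss


# Gradient descent on the gains: the update uses gradients that flowed
# through every simulation step, which is what makes the simulator
# "differentiable" from the learner's point of view.
k1, k2 = 1.0, 1.0
for _ in range(100):
    L = rollout_loss(k1, k2)
    k1 -= 0.01 * L.grad[0]
    k2 -= 0.01 * L.grad[1]
```

In the project's setting the same principle applies at much larger scale: the rollout processes event-camera and ultrasound observations, the policy is a neural network, and reverse-mode differentiation replaces the dual numbers, but the loss-to-parameter gradient path through the physics is the same.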
