Abstract. Indoor Unmanned Aerial Vehicles (UAVs) are often tasked with performing SLAM, and the sensors most commonly used in the literature and in industry are cameras. From stereo to event cameras, visual algorithms have long been the de facto choice for localization. While visual SLAM has reached a high level of localization accuracy, accurate map reconstruction remains challenging. LiDAR sensors, by contrast, have been used for years to obtain accurate maps, first in surveying applications and, over the past decade, in the automotive sector. However, the weight, power, and size constraints of most traditional LiDARs have prevented their installation on UAVs for indoor use. MEMS-based LiDARs have already been deployed on UAVs, but they required algorithms designed to cope with their small field of view (FOV). Recently, a MEMS-based LiDAR with a wide field of view (360° × 59°) and a weight of 265 g has sparked interest in its potential for indoor UAV SLAM. We performed an extensive battery of tests in simulated environments to provide a first look at its effect on state-of-the-art SLAM algorithms, to highlight which ones provide the best results, and to identify which improvements may be most beneficial. To assist further research in the field, we release the tool used for this work.
The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
- Assistive and Autonomous Systems