3D Mobile Mapping of the Environment using Imaging Radar Sensors

Philipp Glira, Christoph Weidinger, Thomas Kadiofsky, Wolfgang Pointner, Katharina Olsbock, Christian Zinner, Masrur Doostdar

Research output: Poster presentation without proceedings, peer-reviewed

Abstract

For 3D sensing of the environment, lidar sensors and stereo cameras are most commonly used. These sensors work best under ideal conditions, i.e. with clear visibility. Radar, in contrast, is largely unaffected by external influences such as rain, snow, fog, or dust due to its longer wavelength (approx. 4 mm at 77 GHz). In this work we investigate the capabilities of an FMCW imaging radar sensor mounted on a mobile platform for 3D topographic mapping. We describe our radar processing pipeline and thereby introduce a new method to extract and model radar targets from radar images, as well as a new method to estimate the extrinsic calibration of radar sensors. We demonstrate these developments by generating a radar-based 3D point cloud and a multi-layer grid map of a quarry.
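
The abstract's processing steps (target extraction, extrinsic calibration, point-cloud generation) are not detailed here; as a rough illustration only, the following minimal Python sketch shows how radar detections given in sensor-frame polar coordinates could be projected into a common world frame using a rigid-body extrinsic calibration and a platform pose. All function names, variable names, and values are hypothetical assumptions, not the authors' pipeline.

```python
# Illustrative sketch (not the poster's method): radar detections assumed to be
# available as range/azimuth/elevation; extrinsics and pose as rotation + translation.
import numpy as np

def polar_to_cartesian(r, azimuth, elevation):
    """Convert range/azimuth/elevation (m, rad, rad) to sensor-frame XYZ."""
    x = r * np.cos(elevation) * np.cos(azimuth)
    y = r * np.cos(elevation) * np.sin(azimuth)
    z = r * np.sin(elevation)
    return np.stack([x, y, z], axis=-1)

def to_world_frame(points_sensor, R_extr, t_extr, R_pose, t_pose):
    """Apply extrinsic calibration (sensor -> platform), then pose (platform -> world)."""
    points_platform = points_sensor @ R_extr.T + t_extr
    return points_platform @ R_pose.T + t_pose

# Example with made-up detections and identity transforms.
detections = polar_to_cartesian(
    r=np.array([12.3, 45.0]),
    azimuth=np.radians([-5.0, 10.0]),
    elevation=np.radians([1.0, -2.0]),
)
cloud = to_world_frame(
    detections,
    R_extr=np.eye(3), t_extr=np.zeros(3),
    R_pose=np.eye(3), t_pose=np.zeros(3),
)
print(cloud.shape)  # (2, 3)
```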
Original language: English
Publication status: Published - 2022

Research Field

  • Assistive and Autonomous Systems

Keywords

  • calibration
  • mapping
  • robotics
  • target modeling

