Use Case VI

Machine Learning for Radar Applications

Date:
26.09.2023, 10:30 - 12:00
Duration:
90 minutes
Location:
H 0.01 + 0.02

Abstract

While its origins in the maritime, aerospace, and defense domains are well known, radar implemented as highly integrated embedded sensors operating in the mm-Wave frequency bands is increasingly becoming a ubiquitous sensing technology. The high-volume automotive market in particular, with applications such as advanced driver assistance systems and highly automated driving, has pushed the development of fully integrated and intelligent radar systems on a chip. Such integrated radar systems are now also available for general-purpose applications and enable advanced contactless sensing, e.g., for robotics navigation and obstacle detection, contactless vital-parameter measurement, and smart-home and IoT applications. Radar ICs have even found their way into smartphones, e.g., for gesture recognition. To extract information about the environment, machine learning techniques are of great interest for radar applications. Applied at different stages of the radar signal processing chain, ML enables, e.g., high-resolution parameter estimation, object classification, semantic segmentation of the environment, or improved target and signal detection. This lecture starts with an introduction to the architecture and fundamentals of modern mm-Wave radar sensors. Special focus is placed on the traditional signal processing chain, to understand how electromagnetic signals in the mm-Wave domain can collect a vast amount of information about the surrounding environment in general or selected targets of interest. Then, state-of-the-art applications of machine learning techniques to boost radar system performance and enable advanced sensing functionality are discussed.
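To give a flavor of the "traditional signal processing chain" the abstract refers to, the following is a minimal sketch of the first stage of FMCW radar processing: a range FFT on the dechirped (beat) signal. All parameter values (bandwidth, chirp duration, sampling rate, target range) are illustrative assumptions, not taken from the lecture.

```python
import numpy as np

# Hypothetical FMCW chirp parameters (illustrative only)
c = 3e8                 # speed of light [m/s]
B = 1e9                 # sweep bandwidth [Hz] -> range resolution c/(2B) = 0.15 m
T_chirp = 100e-6        # chirp duration [s]
S = B / T_chirp         # chirp slope [Hz/s]
fs = 2e6                # complex ADC sampling rate [Hz]
N = int(fs * T_chirp)   # samples per chirp

# Simulated point target
R_true = 10.0           # target range [m]
tau = 2 * R_true / c    # round-trip delay [s]
f_beat = S * tau        # beat frequency after mixing RX with the TX chirp

t = np.arange(N) / fs
x = np.exp(2j * np.pi * f_beat * t)  # ideal dechirped beat signal

# Range FFT: the peak frequency maps back to range via R = f * c / (2 S)
X = np.abs(np.fft.fft(x, n=4096))
f_axis = np.fft.fftfreq(4096, d=1 / fs)
f_est = abs(f_axis[np.argmax(X)])
R_est = f_est * c / (2 * S)
print(f"estimated range: {R_est:.2f} m")
```

In a full chain, a second FFT across consecutive chirps yields Doppler (velocity), and further stages add angle estimation and detection (e.g., CFAR); ML techniques can be applied on the raw, range-Doppler, or detection-level data.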

Prof. Markus Gardill

Chair of Electronic Systems and Sensors,
Brandenburg University of Technology Cottbus