
Development of a Method for the Fusion of Environment Perception Sensor Data for an Automated Vehicle

Uecker, Marc (2021)
Development of a Method for the Fusion of Environment Perception Sensor Data for an Automated Vehicle.
Technische Universität Darmstadt
doi: 10.26083/tuprints-00018613
Master's thesis, first publication, publisher's version

Abstract

Autonomous driving is one of the most anticipated future technologies in the automotive world, and researchers around the globe are dedicated to making it a reality. In this pursuit, the aDDa project at TU Darmstadt brings researchers and students together to jointly engineer a car into a fully autonomous vehicle. To that end, the aDDa research vehicle is outfitted with a wide array of sensors for environment perception.

Within the scope of the aDDa project, this thesis covers the fusion of data from LIDAR, RADAR, and camera sensors into a unified environment model. Specifically, it focuses on providing real-time environment perception, including the fusion and interpretation of data from the different sensors, using only on-board hardware resources.
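
The abstract does not spell out the fusion mechanics, but a common first step in low-level LIDAR-camera fusion is projecting the 3D points into the image plane so that pixel data can be attached to each point. The sketch below illustrates that generic step only, not the thesis's actual implementation; it assumes a standard pinhole camera model with an intrinsic matrix K and a homogeneous extrinsic transform T_cam_lidar, both of which would come from the calibration methods mentioned later in the abstract.

    import numpy as np

    def project_lidar_to_camera(points_lidar, T_cam_lidar, K):
        """Project 3D LIDAR points into pixel coordinates (generic sketch).

        points_lidar : (N, 3) points in the LIDAR frame.
        T_cam_lidar  : (4, 4) homogeneous extrinsic transform, LIDAR -> camera.
        K            : (3, 3) pinhole camera intrinsic matrix.
        Returns (M, 2) pixel coordinates of the M points in front of the
        camera, plus the boolean mask selecting those points.
        """
        n = points_lidar.shape[0]
        pts_h = np.hstack([points_lidar, np.ones((n, 1))])  # homogeneous (N, 4)
        pts_cam = (T_cam_lidar @ pts_h.T)[:3, :]            # camera frame (3, N)
        in_front = pts_cam[2, :] > 0.0                      # drop points behind the image plane
        pts_cam = pts_cam[:, in_front]
        uv = (K @ pts_cam) / pts_cam[2, :]                  # perspective division
        return uv[:2, :].T, in_front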

The developed method is a software pipeline consisting of an analytical low-level sensor fusion stage, a deep-learning-based 3D semantic segmentation model, and analytical clustering and tracking methods, as well as a proof of concept for estimating drivable space. The method is designed to maximize robustness by minimizing the influence of the machine learning component on the reliability of obstacle detection. The pipeline runs in real time with an output frequency of 10 Hz and a pipeline delay of 120 to 190 milliseconds in the situations encountered on public roads.
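
Note that the 10 Hz output rate and the 120 to 190 ms delay are separate quantities: the pipeline publishes a new environment model every 100 ms, but each published model describes the scene as it was one to two cycles earlier. A minimal sketch of how such an end-to-end delay is typically measured, using hypothetical stage names that merely stand in for the thesis's actual stages:

    import time

    def fuse_low_level(frame):    # hypothetical placeholder stages;
        return frame              # the real ones are the thesis's own

    def segment_3d(frame):
        return frame

    def cluster_and_track(frame):
        return frame

    PIPELINE = [fuse_low_level, segment_3d, cluster_and_track]

    def process(frame, sensor_stamp):
        """Run one frame through every stage and report the pipeline
        delay relative to the sensor capture timestamp (same clock)."""
        for stage in PIPELINE:
            frame = stage(frame)
        delay_ms = (time.monotonic() - sensor_stamp) * 1000.0
        return frame, delay_ms

    # Usage: stamp each frame at capture time, e.g.
    # stamp = time.monotonic(); model, delay_ms = process(raw_frame, stamp)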

An evaluation of several scenarios shows that the developed system can reliably detect a target vehicle in a variety of real-world situations.

The contributions of this work include not only the sensor fusion pipeline itself, but also methods for sensor calibration and a novel method for generating training data for the machine learning approach used. In contrast to existing manual annotation methods, this work presents a scalable solution for annotating real-world sensor recordings to generate training data for 3D machine perception approaches in autonomous driving.
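
The abstract does not describe the annotation mechanism itself. One plausible shape for such automated labeling (an assumption here, not a claim about the thesis's method) is to transfer per-pixel classes from a 2D image segmentation onto the LIDAR points that project into the image, reusing project_lidar_to_camera from the sketch above:

    import numpy as np

    def label_points_from_image(points_lidar, seg_mask, T_cam_lidar, K,
                                unknown_label=-1):
        """Hypothetical auto-labeling helper: give each LIDAR point the
        class of the pixel it projects onto.

        seg_mask : (H, W) integer array of per-pixel class IDs.
        Returns an (N,) label array; points behind the camera or
        outside the image keep `unknown_label`.
        """
        h, w = seg_mask.shape
        labels = np.full(points_lidar.shape[0], unknown_label, dtype=np.int64)
        uv, in_front = project_lidar_to_camera(points_lidar, T_cam_lidar, K)
        cols = np.round(uv[:, 0]).astype(int)
        rows = np.round(uv[:, 1]).astype(int)
        visible = (cols >= 0) & (cols < w) & (rows >= 0) & (rows < h)
        original_idx = np.flatnonzero(in_front)[visible]
        labels[original_idx] = seg_mask[rows[visible], cols[visible]]
        return labels

Labels produced this way would still need filtering for occlusion and sensor noise; automating such checks at scale is what distinguishes this kind of pipeline from manual annotation.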

Item type: Master's thesis
Published: 2021
Author(s): Uecker, Marc
Type of entry: First publication
Title: Development of a Method for the Fusion of Environment Perception Sensor Data for an Automated Vehicle
Language: English
Referees: Kuijper, Prof. Dr. Arjan ; Winner, Prof. Dr. Hermann ; Linnhoff, M.Sc. Clemens
Year of publication: 2021
Place of publication: Darmstadt
Collation: XI, 98 pages
DOI: 10.26083/tuprints-00018613
URL / URN: https://tuprints.ulb.tu-darmstadt.de/18613

Status: Publisher's version
URN: urn:nbn:de:tuda-tuprints-186134
Keywords: sensor fusion, environment perception, machine perception, machine learning, object detection, semantic segmentation, pointclouds, point cloud, pointcloud, deep learning, computer vision, autonomous driving, LIDAR, RADAR, camera, calibration, sensor calibration, camera calibration, motion compensation, data annotation, training data, dataset, labeling

Dewey Decimal Classification (DDC): 000 Generalities, computer science, information science > 004 Computer science
600 Technology, medicine, applied sciences > 620 Engineering and mechanical engineering
Department(s)/field(s): 20 Department of Computer Science
20 Department of Computer Science > Interactive Graphics Systems
Date deposited: 29 Jun 2021 09:42
Last modified: 07 Jul 2021 07:26