Multi-Sensor Fusion for the Mason Framework

Schnaubelt, Marius (2016)
Multi-Sensor Fusion for the Mason Framework.
Technische Universität Darmstadt
Bachelor thesis, Bibliography

Abstract

Nowadays, mobile robots are equipped with combinations of complementary sensor systems that enable the robot to perceive its surrounding environment. These sensor systems can, for example, be stereo vision systems, RGB-D cameras, and 3D or spinning 2D laser scanners, each providing different capabilities for environment sensing. A sufficiently accurate world model estimate is crucial for the robot's ability to perform complex tasks such as footstep planning, collision avoidance, path planning, or manipulation. In this thesis, the sensor fusion capability of the new sensor fusion framework Mason is developed. Mason is designed to be deployed on multi-sensor systems and is capable of fusing measurements from an arbitrary number of sensors in order to provide accurate and dense world models. To gain flexibility, the framework supports loading shared libraries at runtime, so that functionality can be added dynamically. For sensor fusion, the spatially hashed truncated signed distance function (TSDF) was chosen, as it stores only the regions of the environment close to surfaces and therefore reduces computational and memory consumption. The presented work is based on the OpenCHISEL library, which was improved and integrated into Mason. The thesis investigates how to combine multiple local TSDF estimates from different sensors into a global TSDF representation. Afterwards, we demonstrate how different world model representations, such as an elevation map, are derived from the TSDF data; this is tested in simulation with the multi-sensor head of the THOR-MANG humanoid robot.
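To make the fusion step more concrete, the following C++ sketch illustrates the general idea of merging several local TSDF volumes into one global volume by weighted averaging over a spatially hashed voxel grid. It is an illustrative sketch only, under simplified assumptions: the types Voxel, VoxelKey, and TsdfMap and the function fuseInto are hypothetical and are not part of the Mason or OpenCHISEL APIs.

#include <cstdint>
#include <functional>
#include <iostream>
#include <unordered_map>

// Hypothetical, simplified voxel: truncated signed distance plus integration weight.
struct Voxel {
    float sdf    = 0.0f;  // truncated signed distance to the nearest surface
    float weight = 0.0f;  // confidence accumulated over integrated measurements
};

// Integer voxel coordinate used as the key of the spatial hash.
struct VoxelKey {
    int x, y, z;
    bool operator==(const VoxelKey& o) const { return x == o.x && y == o.y && z == o.z; }
};

struct VoxelKeyHash {
    std::size_t operator()(const VoxelKey& k) const {
        // Common prime-multiplication hash for 3D integer coordinates.
        return std::hash<int>()(k.x) * 73856093u ^
               std::hash<int>()(k.y) * 19349669u ^
               std::hash<int>()(k.z) * 83492791u;
    }
};

using TsdfMap = std::unordered_map<VoxelKey, Voxel, VoxelKeyHash>;

// Merge a local TSDF (e.g. from one sensor) into the global TSDF by weighted
// averaging, the standard TSDF update rule; only voxels present in the local
// map are touched, mirroring the sparsity of a spatially hashed TSDF.
void fuseInto(TsdfMap& global, const TsdfMap& local) {
    for (const auto& [key, lv] : local) {
        Voxel& gv = global[key];  // creates the global voxel if it does not exist yet
        const float w = gv.weight + lv.weight;
        if (w <= 0.0f) continue;
        gv.sdf    = (gv.sdf * gv.weight + lv.sdf * lv.weight) / w;
        gv.weight = w;
    }
}

int main() {
    TsdfMap global, lidar, camera;
    lidar[{1, 2, 3}]  = {0.05f, 1.0f};  // local estimate from a laser scanner
    camera[{1, 2, 3}] = {0.03f, 3.0f};  // local estimate from an RGB-D camera
    fuseInto(global, lidar);
    fuseInto(global, camera);
    std::cout << "fused sdf: " << global[{1, 2, 3}].sdf << "\n";  // approximately 0.035
    return 0;
}

In this scheme, each sensor's contribution is scaled by its accumulated integration weight, so a sensor that has observed a voxel more often (or with higher confidence) dominates the fused distance estimate.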

Item type: Bachelor thesis
Published: 2016
Author(s): Schnaubelt, Marius
Type of entry: Bibliography
Title: Multi-Sensor Fusion for the Mason Framework
Language: English
Year of publication: 2016
Place of publication: Department of Computer Science (SIM)
Related links:
Department(s)/field(s): 20 Department of Computer Science
20 Department of Computer Science > Simulation, Systemoptimierung und Robotik
Date deposited: 26 Jun 2019 07:50
Last modified: 26 Jun 2019 07:50
PPN: