TU Darmstadt / ULB / TUbiblio

Multi-Sensor Fusion for the Mason Framework

Schnaubelt, Marius (2016):
Multi-Sensor Fusion for the Mason Framework.
Department of Computer Science (SIM), TU Darmstadt, [Bachelor Thesis]

Abstract

Nowadays, mobile robots are equipped with combinations of complementary sensor systems that enable the robot to perceive its surrounding environment. Such sensor systems include stereo vision systems, RGB-D cameras, and 3D or spinning 2D laser scanners, each providing different capabilities for environment sensing. An accurate world model estimate is crucial for the robot's ability to perform complex tasks such as footstep planning, collision avoidance, path planning, or manipulation. In this thesis, the sensor fusion capability of the new sensor fusion framework Mason is developed. Mason is designed to be deployed on multi-sensor systems and is capable of fusing measurements from an arbitrary number of sensors in order to provide accurate and dense world models. To gain flexibility, the framework supports loading shared libraries at runtime, which adds functionality to the framework dynamically. For sensor fusion, the spatially hashed truncated signed distance function (TSDF) was chosen, as it only stores the environment near object surfaces and therefore reduces computational and memory consumption. The presented work is based on the OpenCHISEL library, which was improved and integrated into Mason. The thesis investigates how to combine multiple local TSDF estimates from different sensors into a global TSDF representation. Afterwards, we demonstrate how different world model representations, for example an elevation map, are created from the TSDF data, tested in simulation with the multi-sensor head of the THOR-MANG humanoid robot.
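The fusion of local TSDF estimates into a global representation, as described in the abstract, can be illustrated with a minimal sketch. This is a generic weighted-averaging TSDF merge over a spatially hashed voxel grid, not the actual Mason or OpenCHISEL API; the dictionary keyed by integer voxel coordinates stands in for the spatial hash, so only observed voxels near surfaces are stored.

```python
def fuse_tsdf(global_tsdf, local_tsdf):
    """Merge a local TSDF into a global one.

    Both volumes are dicts mapping an integer voxel coordinate
    (x, y, z) to a (signed_distance, weight) pair. The dict acts as
    a spatial hash: unobserved voxels are simply absent, which is
    what keeps memory consumption low.
    """
    for voxel, (sdf, weight) in local_tsdf.items():
        if voxel in global_tsdf:
            g_sdf, g_weight = global_tsdf[voxel]
            new_weight = g_weight + weight
            # Standard weighted running average of signed distances,
            # so sensors with higher confidence contribute more.
            new_sdf = (g_sdf * g_weight + sdf * weight) / new_weight
            global_tsdf[voxel] = (new_sdf, new_weight)
        else:
            # Previously unobserved voxel: adopt the local estimate.
            global_tsdf[voxel] = (sdf, weight)
    return global_tsdf
```

With this scheme, fusing measurements from an arbitrary number of sensors reduces to repeatedly merging each sensor's local volume into the shared global one.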

Item Type: Bachelor Thesis
Published: 2016
Creators: Schnaubelt, Marius
Title: Multi-Sensor Fusion for the Mason Framework
Language: English
Place of Publication: Department of Computer Science (SIM)
Divisions: 20 Department of Computer Science
20 Department of Computer Science > Simulation, Systems Optimization and Robotics Group
Date Deposited: 26 Jun 2019 07:50
