Autonomous Assistance for Versatile Grasping with Rescue Robots

Schnaubelt, Marius ; Kohlbrecher, Stefan ; Stryk, Oskar von (2019):
Autonomous Assistance for Versatile Grasping with Rescue Robots.
In: International Symposium on Safety, Security, and Rescue Robotics (SSRR 2019), Würzburg, Germany, 02.-04.09.2019, pp. 210-215, IEEE, ISBN 978-1-7281-0779-0,
DOI: 10.1109/SSRR.2019.8848947,
[Conference or Workshop Item]

Abstract

The deployment of mobile robots in urban search and rescue (USAR) scenarios often requires manipulation abilities, for example, for clearing debris or opening a door. Conventional teleoperated control of mobile manipulator arms with many degrees of freedom in unknown and unstructured environments is highly challenging and error-prone. Flexible semi-autonomous manipulation capabilities therefore promise valuable support to the operator and may also prevent failures during missions. However, most existing approaches are not flexible enough: they either assume a priori known objects or object classes, or they require manual selection of grasp poses. In this paper, an approach is presented that combines a segmented 3D model of the scene with grasp pose detection. It enables grasping arbitrary rigid objects based on a geometric segmentation approach that divides the scene into objects. Antipodal grasp candidates sampled by the grasp pose detection are ranked to ensure a robust grasp. The human remotely operating the robot controls the grasping process with two short interactions in the user interface. Our real-robot experiments demonstrate the capability to grasp various objects in cluttered environments.
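The ranking of antipodal grasp candidates mentioned in the abstract can be illustrated with a minimal, hypothetical sketch. All names, the point/normal data, and the scoring heuristic below are illustrative assumptions for the general antipodal-grasp idea, not the authors' actual implementation:

```python
import math

def antipodal_score(p1, n1, p2, n2, max_width=0.1):
    """Score a candidate grasp formed by two surface contact points.

    A pair is a good antipodal candidate when the surface normals point
    in nearly opposite directions and the points fit within the gripper
    opening (max_width, in meters). Returns 0.0 for infeasible pairs;
    higher is better otherwise.
    """
    diff = [b - a for a, b in zip(p1, p2)]
    width = math.sqrt(sum(d * d for d in diff))
    if width == 0.0 or width > max_width:
        return 0.0
    # Cosine between the unit normals: -1 means perfectly antipodal.
    cos_nn = sum(a * b for a, b in zip(n1, n2))
    # Prefer opposing normals; mildly prefer narrower grasps.
    return max(0.0, -cos_nn) / (1.0 + width)

def rank_candidates(candidates):
    """Sort (p1, n1, p2, n2) candidate tuples by descending score."""
    return sorted(candidates, key=lambda c: antipodal_score(*c), reverse=True)

# Toy example: two opposing faces of a small box vs. a skewed contact pair.
good = ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.05, 0.0, 0.0), (-1.0, 0.0, 0.0))
bad = ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.05, 0.0, 0.0), (0.0, 1.0, 0.0))
best = rank_candidates([bad, good])[0]
```

In the toy example the pair with opposing normals (`good`) ranks first, mirroring how ranked antipodal candidates let the operator confirm a robust grasp with only a short interaction.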

Item Type: Conference or Workshop Item
Published: 2019
Creators: Schnaubelt, Marius ; Kohlbrecher, Stefan ; Stryk, Oskar von
Title: Autonomous Assistance for Versatile Grasping with Rescue Robots
Language: English

Publisher: IEEE
ISBN: 978-1-7281-0779-0
Divisions: 20 Department of Computer Science
20 Department of Computer Science > Simulation, Systems Optimization and Robotics Group
TU-Projects: Bund/BMBF|13N14861|A-DRZ
Event Title: International Symposium on Safety, Security, and Rescue Robotics (SSRR 2019)
Event Location: Würzburg, Germany
Event Dates: 02.-04.09.2019
Date Deposited: 02 Dec 2020 12:50
DOI: 10.1109/SSRR.2019.8848947
