
Autonomous Assistance for Versatile Grasping with Rescue Robots

Schnaubelt, Marius ; Kohlbrecher, Stefan ; Stryk, Oskar von (2019)
Autonomous Assistance for Versatile Grasping with Rescue Robots.
International Symposium on Safety, Security, and Rescue Robotics (SSRR 2019). Würzburg, Germany (02.-04.09.)
doi: 10.1109/SSRR.2019.8848947
Conference publication, Bibliography

Abstract

The deployment of mobile robots in urban search and rescue (USAR) scenarios often requires manipulation abilities, for example, for clearing debris or opening a door. Conventional teleoperated control of mobile manipulator arms with a high number of degrees of freedom in unknown and unstructured environments is highly challenging and error-prone. Flexible semi-autonomous manipulation capabilities therefore promise valuable support to the operator and may also prevent failures during missions. However, most existing approaches are not flexible enough, as they either assume a priori known objects or object classes or require manual selection of grasp poses. In this paper, an approach is presented that combines a segmented 3D model of the scene with grasp pose detection. It enables grasping arbitrary rigid objects based on a geometric segmentation approach that divides the scene into objects. Antipodal grasp candidates sampled by the grasp pose detection are ranked to ensure a robust grasp. The human remotely operating the robot can control the grasping process with two short interactions in the user interface. Our real-robot experiments demonstrate the capability to grasp various objects in cluttered environments.
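
The grasping pipeline summarized above (geometric segmentation of the 3D scene into objects, sampling of antipodal grasp candidates, ranking, and a short operator confirmation) can be illustrated with a minimal sketch. The code below is not the authors' implementation; the function rank_antipodal_candidates, the 5 cm neighborhood radius, and the spread-based robustness score are assumptions chosen only to make the ranking idea concrete.

    import numpy as np

    def rank_antipodal_candidates(candidates, object_points):
        """Rank grasp candidates for one segmented object.

        candidates:    list of (approach_dir, closing_dir, center) tuples
        object_points: (N, 3) array of points belonging to the object
        Returns the candidates and scores sorted from most to least robust.
        """
        scores = []
        for approach_dir, closing_dir, center in candidates:
            # Consider only object points close to the grasp center
            # (the 5 cm radius is an illustrative assumption).
            dist = np.linalg.norm(object_points - center, axis=1)
            nearby = object_points[dist < 0.05]
            if nearby.shape[0] == 0:
                scores.append(-np.inf)
                continue
            # Crude antipodal measure: how much of the nearby surface
            # spreads along the gripper's closing direction.
            spread = np.ptp(nearby @ closing_dir)
            scores.append(spread)
        order = np.argsort(scores)[::-1]
        return [candidates[i] for i in order], [scores[i] for i in order]

    # Hypothetical usage: object_points would come from the geometric scene
    # segmentation, candidates from a grasp pose detector; both are mocked here.
    object_points = np.random.rand(500, 3) * 0.1
    candidates = [(np.array([0.0, 0.0, -1.0]),   # approach direction
                   np.array([1.0, 0.0, 0.0]),    # closing direction
                   object_points.mean(axis=0))]  # grasp center
    ranked, scores = rank_antipodal_candidates(candidates, object_points)
    print("best grasp center:", ranked[0][2], "score:", scores[0])

In the paper's setting, the operator would then confirm one of the top-ranked candidates via the user interface before the grasp is executed.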

Item type: Conference publication
Published: 2019
Author(s): Schnaubelt, Marius ; Kohlbrecher, Stefan ; Stryk, Oskar von
Type of entry: Bibliography
Title: Autonomous Assistance for Versatile Grasping with Rescue Robots
Language: English
Date of publication: 26 September 2019
Publisher: IEEE
Event title: International Symposium on Safety, Security, and Rescue Robotics (SSRR 2019)
Event location: Würzburg, Germany
Event dates: 02.-04.09.2019
DOI: 10.1109/SSRR.2019.8848947
Department(s)/field(s): 20 Department of Computer Science
20 Department of Computer Science > Simulation, Systems Optimization and Robotics
TU projects: Bund/BMBF|13N14861|A-DRZ
Date deposited: 02 Dec 2020 12:50
Last modified: 16 Aug 2021 07:39