
Learning Sequential Force Interaction Skills

Manschitz, Simon ; Gienger, Michael ; Kober, Jens ; Peters, Jan (2024)
Learning Sequential Force Interaction Skills.
In: Robotics, 2020, 9 (2)
doi: 10.26083/tuprints-00016992
Article, secondary publication, publisher's version

Warning: A newer version of this entry is available.

Abstract

Learning skills from kinesthetic demonstrations is a promising way of minimizing the gap between human manipulation abilities and those of robots. We propose an approach to learn sequential force interaction skills from such demonstrations. The demonstrations are decomposed into a set of movement primitives by inferring the underlying sequential structure of the task. The decomposition is based on a novel probability distribution which we call the Directional Normal Distribution. The distribution allows inferring a movement primitive's composition, i.e., its coordinate frames, control variables, and target coordinates, from the demonstrations. In addition, it permits determining an appropriate number of movement primitives for a task via model selection. After finding the task's composition, the system learns to sequence the resulting movement primitives so that it can reproduce the task on a real robot. We evaluate the approach on three different tasks: unscrewing a light bulb, box stacking, and box flipping. All tasks are kinesthetically demonstrated and then reproduced on a Barrett WAM robot.
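As a rough illustration of the sequencing step only — not the paper's actual method, which additionally infers each primitive's coordinate frames and control variables via the Directional Normal Distribution — one could learn an ordering over already-labeled primitives by counting transitions across demonstrations. All primitive names below are hypothetical:

```python
from collections import defaultdict

def learn_transitions(label_sequences):
    """Count primitive-to-primitive transitions across demonstrations
    and normalize them into transition probabilities."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in label_sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return {a: {b: n / sum(nexts.values()) for b, n in nexts.items()}
            for a, nexts in counts.items()}

def most_likely_sequence(transitions, start, steps):
    """Greedily follow the most probable next primitive."""
    seq = [start]
    for _ in range(steps):
        nexts = transitions.get(seq[-1])
        if not nexts:
            break  # terminal primitive: no observed successor
        seq.append(max(nexts, key=nexts.get))
    return seq

# Two toy demonstrations of a light-bulb-unscrewing task.
demos = [["reach", "grasp", "unscrew", "retract"],
         ["reach", "grasp", "unscrew", "unscrew", "retract"]]
T = learn_transitions(demos)
print(most_likely_sequence(T, "reach", 4))
# → ['reach', 'grasp', 'unscrew', 'retract']
```

This greedy first-order sketch ignores perception; the paper's system instead selects the next primitive from sensor information, which is what allows it to repeat a primitive (e.g., keep unscrewing) until the task state changes.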

Entry type: Article
Published: 2024
Author(s): Manschitz, Simon ; Gienger, Michael ; Kober, Jens ; Peters, Jan
Type of entry: Secondary publication
Title: Learning Sequential Force Interaction Skills
Language: English
Publication year: 15 January 2024
Place: Darmstadt
Date of first publication: 2020
Place of first publication: Basel
Publisher: MDPI
Journal or series title: Robotics
Volume: 9
Issue number: 2
Collation: 30 pages
DOI: 10.26083/tuprints-00016992
URL / URN: https://tuprints.ulb.tu-darmstadt.de/16992
Origin: Secondary publication via DeepGreen

Uncontrolled keywords: human-robot interaction, motor skill learning, learning from demonstration, behavioral cloning
Status: Publisher's version
URN: urn:nbn:de:tuda-tuprints-169926
Additional information:

This article belongs to the Special Issue Feature Papers 2020

Dewey Decimal Classification (DDC): 000 Generalities, computer science, information science > 004 Computer science
600 Technology, medicine, applied sciences > 621.3 Electrical engineering, electronics
Department(s): 20 Department of Computer Science
20 Department of Computer Science > Intelligent Autonomous Systems
Date deposited: 15 Jan 2024 14:03
Last modified: 18 Jan 2024 12:42