
Interactively learning behavior trees from imperfect human demonstrations

Scherf, Lisa ; Schmidt, Aljoscha ; Pal, Suman ; Koert, Dorothea (2023)
Interactively learning behavior trees from imperfect human demonstrations.
In: Frontiers in Robotics and AI, 2023, 10
doi: 10.26083/tuprints-00024370
Article, secondary publication, publisher's version

Warning: A newer version of this entry is available.

Abstract

Introduction: In Interactive Task Learning (ITL), an agent learns a new task through natural interaction with a human instructor. Behavior Trees (BTs) offer a reactive, modular, and interpretable way of encoding task descriptions but have so far seen little use in robotic ITL settings. Most existing approaches that learn a BT from human demonstrations require the user to specify each action step by step or do not allow a learned BT to be adapted without repeating the entire teaching process from scratch.

Method: We propose a new framework to directly learn a BT from only a few human task demonstrations recorded as RGB-D video streams. We automatically extract continuous pre- and post-conditions for BT action nodes from visual features and use a Backchaining approach to build a reactive BT. In a user study on how non-experts provide and vary demonstrations, we identify three common failure cases of a BT learned from potentially imperfect initial human demonstrations. We offer a way to interactively resolve these failure cases by refining the existing BT through interaction with a user over a web interface. Specifically, failure cases or unknown states are detected automatically during the execution of a learned BT, and the initial BT is adjusted or extended according to the provided user input.

Evaluation and results: We evaluate our approach on a robotic trash disposal task with 20 human participants and demonstrate that our method is capable of learning reactive BTs from only a few human demonstrations and interactively resolving possible failure cases at runtime.
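The Backchaining construction mentioned in the Method paragraph expands a goal condition into a Fallback over the goal and a Sequence of an action's preconditions plus the action itself, recursively. The following is a minimal illustrative sketch of that idea, not the authors' implementation; the node classes, string tick statuses, instantaneous actions (a real BT would also return RUNNING), and the toy trash-disposal actions are all hypothetical:

```python
# Hedged sketch: Backchaining a reactive Behavior Tree from actions
# annotated with pre- and post-conditions. All names are hypothetical.

class Condition:
    """Leaf that checks a predicate on the world state."""
    def __init__(self, name, pred):
        self.name, self.pred = name, pred
    def tick(self, state):
        return "SUCCESS" if self.pred(state) else "FAILURE"

class Action:
    """Leaf that applies an effect to the world state (instantaneous here)."""
    def __init__(self, name, effect):
        self.name, self.effect = name, effect
    def tick(self, state):
        self.effect(state)
        return "SUCCESS"

class Sequence:
    """Succeeds only if all children succeed, left to right."""
    def __init__(self, children):
        self.children = children
    def tick(self, state):
        for child in self.children:
            status = child.tick(state)
            if status != "SUCCESS":
                return status
        return "SUCCESS"

class Fallback:
    """Returns the status of the first child that does not fail."""
    def __init__(self, children):
        self.children = children
    def tick(self, state):
        for child in self.children:
            status = child.tick(state)
            if status != "FAILURE":
                return status
        return "FAILURE"

def backchain(goal, actions):
    """Expand a goal condition into Fallback(goal, Sequence(pre..., action)),
    picking an action whose post-condition achieves the goal and expanding
    its preconditions recursively."""
    for action, preconditions, postcondition in actions:
        if postcondition == goal.name:
            subtrees = [backchain(pre, actions) for pre in preconditions]
            return Fallback([goal, Sequence(subtrees + [action])])
    return goal  # no action achieves this condition: leave the bare check

# Toy trash-disposal task: pick up the trash, then drop it into the bin.
holding = Condition("holding_trash", lambda s: s["holding"])
in_bin = Condition("trash_in_bin", lambda s: s["in_bin"])
actions = [
    (Action("pick", lambda s: s.__setitem__("holding", True)), [], "holding_trash"),
    (Action("drop", lambda s: s.__setitem__("in_bin", True)), [holding], "trash_in_bin"),
]
tree = backchain(in_bin, actions)
state = {"holding": False, "in_bin": False}
tree.tick(state)
```

Because conditions are re-checked on every tick, the resulting tree is reactive: if the trash slips out of the gripper, the `holding_trash` condition fails on the next tick and the pick action is retried before dropping.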

Entry type: Article
Published: 2023
Author(s): Scherf, Lisa ; Schmidt, Aljoscha ; Pal, Suman ; Koert, Dorothea
Type of entry: Secondary publication
Title: Interactively learning behavior trees from imperfect human demonstrations
Language: English
Year of publication: 2023
Place of publication: Darmstadt
Date of first publication: 2023
Publisher: Frontiers Media S.A.
Journal or series title: Frontiers in Robotics and AI
Journal volume: 10
Collation: 19 pages
DOI: 10.26083/tuprints-00024370
URL / URN: https://tuprints.ulb.tu-darmstadt.de/24370
Related links:
Origin: Secondary publication DeepGreen

Free keywords: human-robot interaction, interactive task learning, behavior trees, learning from demonstration, robotic tasks, user studies, failure detection, failure recovery
Status: Publisher's version
URN: urn:nbn:de:tuda-tuprints-243701
Dewey Decimal Classification (DDC): 000 Generalities, computer science, information science > 004 Computer science
Divisions: 20 Department of Computer Science
20 Department of Computer Science > Intelligent Autonomous Systems
Central institutions
Central institutions > Centre for Cognitive Science (CCS)
Date deposited: 04 Aug 2023 12:04
Last modified: 07 Aug 2023 09:46

