Scherf, Lisa ; Schmidt, Aljoscha ; Pal, Suman ; Koert, Dorothea (2023)
Interactively learning behavior trees from imperfect human demonstrations.
In: Frontiers in Robotics and AI, 10
doi: 10.3389/frobt.2023.1152595
Article, Bibliography
This is the latest version of this entry.
Abstract
Introduction: In Interactive Task Learning (ITL), an agent learns a new task through natural interaction with a human instructor. Behavior Trees (BTs) offer a reactive, modular, and interpretable way of encoding task descriptions but have not yet been widely applied in robotic ITL settings. Most existing approaches that learn a BT from human demonstrations require the user to specify each action step-by-step or do not allow a learned BT to be adapted without repeating the entire teaching process from scratch.
Method: We propose a new framework to directly learn a BT from only a few human task demonstrations recorded as RGB-D video streams. We automatically extract continuous pre- and post-conditions for BT action nodes from visual features and use a Backchaining approach to build a reactive BT. In a user study on how non-experts provide and vary demonstrations, we identify three common failure cases of a BT learned from potentially imperfect initial human demonstrations. We offer a way to interactively resolve these failure cases by refining the existing BT through interaction with a user over a web interface. Specifically, failure cases or unknown states are detected automatically during the execution of a learned BT, and the initial BT is adjusted or extended according to the provided user input.
Evaluation and results: We evaluate our approach on a robotic trash disposal task with 20 human participants and demonstrate that our method is capable of learning reactive BTs from only a few human demonstrations and interactively resolving possible failure cases at runtime.
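The Backchaining idea described in the abstract can be illustrated with a minimal sketch: starting from a goal condition, the tree is expanded so that an action is only executed when its post-condition does not yet hold and its pre-condition does. The node classes, the world-state dictionary, and the toy trash-disposal step below are illustrative assumptions, not the authors' implementation.

```python
# Minimal Backchaining sketch for behavior trees (illustrative only).
SUCCESS, FAILURE, RUNNING = "SUCCESS", "FAILURE", "RUNNING"

class Condition:
    """Leaf node: checks a predicate against the world state."""
    def __init__(self, name, check):
        self.name, self.check = name, check
    def tick(self, state):
        return SUCCESS if self.check(state) else FAILURE

class Action:
    """Leaf node: applies the action's effects to the world state."""
    def __init__(self, name, effect):
        self.name, self.effect = name, effect
    def tick(self, state):
        self.effect(state)
        return SUCCESS

class Sequence:
    """Succeeds only if all children succeed, ticked left to right."""
    def __init__(self, children): self.children = children
    def tick(self, state):
        for child in self.children:
            result = child.tick(state)
            if result != SUCCESS:
                return result
        return SUCCESS

class Fallback:
    """(Selector) returns the first non-FAILURE child result."""
    def __init__(self, children): self.children = children
    def tick(self, state):
        for child in self.children:
            result = child.tick(state)
            if result != FAILURE:
                return result
        return FAILURE

def backchain(goal, action, precondition):
    # If the goal (the action's post-condition) already holds, do nothing;
    # otherwise check the pre-condition and execute the action. The
    # pre-condition can itself be expanded recursively the same way.
    return Fallback([goal, Sequence([precondition, action])])

# Toy step: "trash in bin" is the goal, "holding trash" the pre-condition.
state = {"trash_in_bin": False, "holding_trash": True}
tree = backchain(
    Condition("trash_in_bin", lambda s: s["trash_in_bin"]),
    Action("drop_trash", lambda s: s.update(trash_in_bin=True)),
    Condition("holding_trash", lambda s: s["holding_trash"]),
)
tree.tick(state)  # executes drop_trash, since the goal did not yet hold
```

The Fallback-over-Sequence pattern is what makes the resulting tree reactive: on every tick the goal condition is re-checked first, so if the world already satisfies it (or later stops satisfying it), execution skips or resumes the action accordingly.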
Entry type: | Article |
---|---|
Published: | 2023 |
Author(s): | Scherf, Lisa ; Schmidt, Aljoscha ; Pal, Suman ; Koert, Dorothea |
Type of entry: | Bibliography |
Title: | Interactively learning behavior trees from imperfect human demonstrations |
Language: | English |
Year of publication: | 2023 |
Place: | Darmstadt |
Publisher: | Frontiers Media S.A. |
Journal, newspaper, or series title: | Frontiers in Robotics and AI |
Volume: | 10 |
Collation: | 19 pages |
DOI: | 10.3389/frobt.2023.1152595 |
Free keywords: | human-robot interaction, interactive task learning, behavior trees, learning from demonstration, robotic tasks, user studies, failure detection, failure recovery |
Dewey Decimal Classification (DDC) subject group: | 000 General works, computer science, information science > 004 Computer science |
Department(s)/division(s): | 20 Department of Computer Science ; 20 Department of Computer Science > Intelligent Autonomous Systems ; Central Facilities ; Central Facilities > Centre for Cognitive Science (CCS) |
Date deposited: | 02 Aug 2024 12:54 |
Last modified: | 02 Aug 2024 12:54 |
Available versions of this entry
- Interactively learning behavior trees from imperfect human demonstrations. (deposited 04 Aug 2023 12:04)
- Interactively learning behavior trees from imperfect human demonstrations. (deposited 02 Aug 2024 12:54) [Currently displayed]