Robot-aided Pointing Gesture Recognition in Intelligent Environments

Prediger, Mark (2013)
Robot-aided Pointing Gesture Recognition in Intelligent Environments.
Technische Universität Darmstadt
Bachelor thesis, Bibliography

Abstract

In the research field of Intelligent Environments, the explicit control of the devices they contain is a major topic. Hand and arm gestures are an intuitive way to select and control such devices and can be captured by inexpensive depth cameras such as the Microsoft Kinect. However, covering multiple rooms requires several depth cameras, which increases the cost and complexity of such a system in larger homes. To address this problem, we use a mobile robot that permanently follows and tracks the user. To control a device, the user can simply point at it with their arm from any location in the environment. For this purpose, the robot is equipped with a consumer depth camera that is used for gesture recognition and for self-localization, from which the global pointing direction is computed. The pointing direction is communicated to the Intelligent Environment's system, which checks for intersections with known devices and changes their state. To demonstrate the feasibility of our approach, we realized a prototype implementation on a real robot and evaluated it with a group of participants. Finally, we compared it with a gesture control system that uses multiple cameras, focusing on user acceptance and cost.
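The pipeline outlined in the abstract (gesture recognition and self-localization on the robot, then an intersection test against known devices in the environment) can be illustrated with a minimal sketch. The code below is not taken from the thesis; it assumes a planar robot pose (x, y, yaw) from self-localization, shoulder and hand joint positions from the depth camera's skeleton tracker, and spherical interaction volumes around known devices. All names, coordinates, and the 0.3 m radius are illustrative assumptions.

```python
import numpy as np

def pointing_ray_world(robot_pose, shoulder_cam, hand_cam):
    """Build the pointing ray in world coordinates.

    robot_pose   : (x, y, yaw) planar pose from the robot's self-localization
                   (hypothetical representation, not from the thesis)
    shoulder_cam : 3-D shoulder joint position in the robot/camera frame
    hand_cam     : 3-D hand joint position in the robot/camera frame
    Returns (origin, unit direction) of the pointing ray in the world frame.
    """
    x, y, yaw = robot_pose
    c, s = np.cos(yaw), np.sin(yaw)
    # Rotation about the vertical axis plus planar translation; the camera's
    # mounting offset on the robot is ignored in this sketch.
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    t = np.array([x, y, 0.0])
    origin = R @ shoulder_cam + t
    direction = R @ (hand_cam - shoulder_cam)
    return origin, direction / np.linalg.norm(direction)

def hit_device(origin, direction, devices, radius=0.3):
    """Return the nearest device whose (hypothetical) spherical interaction
    volume is intersected by the pointing ray, or None if there is no hit."""
    best, best_t = None, np.inf
    for name, center in devices.items():
        w = np.asarray(center) - origin
        t = np.dot(w, direction)               # closest approach along the ray
        if t < 0:
            continue                           # device is behind the user
        dist = np.linalg.norm(w - t * direction)
        if dist <= radius and t < best_t:
            best, best_t = name, t
    return best

# Toy usage: the user points roughly at a lamp in front of the robot.
devices = {"lamp": (3.0, 1.0, 1.5), "tv": (0.0, 4.0, 1.0)}
origin, direction = pointing_ray_world(
    (1.0, 1.0, 0.0),                           # robot at (1, 1), facing +x
    np.array([0.3, 0.0, 1.4]),                 # shoulder in the camera frame
    np.array([0.6, 0.0, 1.42]))                # hand slightly forward
print(hit_device(origin, direction, devices))  # -> "lamp"
```

In the thesis setting, the resulting device identifier would then be sent to the Intelligent Environment's system, which toggles the device's state.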

Type of entry: Bachelor thesis
Published: 2013
Author(s): Prediger, Mark
Kind of entry: Bibliography
Title: Robot-aided Pointing Gesture Recognition in Intelligent Environments
Language: English
Year of publication: 2013

Uncontrolled keywords: Business Field: Digital society, Research Area: Confluence of graphics and vision, Robotics, Gesture based interaction, Ambient assisted living (AAL), Ambient intelligence (AmI)
Additional information:

69 p.

Division(s): 20 Department of Computer Science
20 Department of Computer Science > Interactive Graphics Systems
Date deposited: 12 Nov 2018 11:16
Last modified: 12 Nov 2018 11:16