
Narrative Use of Sign Language by a Virtual Character for the Hearing Impaired

Rieger, Thomas; Braun, Norbert (2003):
Narrative Use of Sign Language by a Virtual Character for the Hearing Impaired.
In: Computer Graphics Forum, 22 (3), pp. 651-666. [Article]

Abstract

This paper describes the concept and control of a 3D virtual character system with facial expressions and gestures, used as a conversational user interface with narrative expressiveness for the hearing impaired. The gestures and facial expressions are based on morphing techniques. The system generates sign language and mouth motion in real time from text, at a quality sufficient for lip reading. The concept of Narrative Extended Speech Acts (NESA) is introduced, based on Interactive Storytelling techniques and the concepts of Narrative Conflict and Suspense Progression. We define a set of annotation tags to be used with NESAs. We use NESAs to classify conversation fragments and to enhance computer-generated sign language. We describe how the sign language gestures are generated and show the possibilities for editing them. Furthermore, we give details on how NESAs are mapped to gestures. We show how the virtual character's behaviour and gestures can be controlled in a human-oriented way and provide an outlook on future work.

Item Type: Article
Published: 2003
Creators: Rieger, Thomas; Braun, Norbert
Title: Narrative Use of Sign Language by a Virtual Character for the Hearing Impaired
Language: English
Journal or Publication Title: Computer Graphics Forum
Journal volume: 22
Number: 3
Uncontrolled Keywords: Virtual characters, Virtual narrators, Narrative intelligence, Avatar behavior, Story engine
Divisions: 20 Department of Computer Science
20 Department of Computer Science > Interactive Graphics Systems
Date Deposited: 16 Apr 2018 09:04