TU Darmstadt / ULB / TUbiblio

Multimodal Kinect-supported Interaction for the Visually Impaired

Gross, Richard (2012)
Multimodal Kinect-supported Interaction for the Visually Impaired.
Technische Universität Darmstadt
Master's thesis, bibliographic entry

Abstract

This thesis proposes a new computer interface targeted specifically at blind and visually impaired people. We use the Microsoft Kinect to track the user's position and have implemented a novel spatial interface for controlling text-to-speech synthesis of a document. Actions are triggered solely by hand movements relative to the body. All feedback is given in auditory form, through synthesized speech or earcons, i.e. brief, distinctive sounds that convey information. Visually impaired or blind users do not have to point at a screen or memorize keyboard commands; instead, they can use their proprioceptive sense to explore documents and execute actions effectively. The test results are encouraging. Even when participants found themselves lost, they were always able to find their way back to an interface state they knew how to navigate. Furthermore, most of the negative feedback can be attributed to current technical limitations rather than to the spatial interface itself.
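The core idea of the spatial interface (hand position relative to the body selecting an action) can be illustrated with a minimal sketch. This is not the thesis's actual implementation: the zone names, the action set, and the 25 cm threshold are illustrative assumptions, and real Kinect joint data would replace the hand-coded coordinates.

```python
from dataclasses import dataclass

@dataclass
class Point3D:
    """A tracked joint position in metres (Kinect-style camera space)."""
    x: float  # positive toward the user's right
    y: float  # positive upward
    z: float  # positive away from the sensor

def classify_zone(hand: Point3D, torso: Point3D, threshold: float = 0.25) -> str:
    """Map the hand's offset from the torso centre to a named action zone.

    The zone names ("next", "previous", "faster", "slower", "rest") and the
    25 cm dead-zone threshold are hypothetical, chosen only to illustrate
    body-relative spatial selection; the thesis defines its own layout.
    """
    dx = hand.x - torso.x
    dy = hand.y - torso.y
    # Near the body: no action is triggered.
    if abs(dx) < threshold and abs(dy) < threshold:
        return "rest"
    # A mostly horizontal reach selects document navigation.
    if abs(dx) >= abs(dy):
        return "next" if dx > 0 else "previous"
    # A mostly vertical reach adjusts speech rate.
    return "faster" if dy > 0 else "slower"

# Reaching 40 cm to the right of the torso selects the "next" action.
print(classify_zone(Point3D(0.4, 0.0, 0.0), Point3D(0.0, 0.0, 0.0)))
```

Because the zones are anchored to the user's own torso rather than to screen coordinates, a user can rely on proprioception alone: each action always sits in the same place relative to the body, which is the property the abstract highlights.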

Entry type: Master's thesis
Published: 2012
Author(s): Gross, Richard
Type of entry: Bibliography
Title: Multimodal Kinect-supported Interaction for the Visually Impaired
Language: English
Year of publication: 2012

Uncontrolled keywords: Business Field: Digital society, Research Area: Semantics in the modeling process, Intuitive interaction, Human-computer interaction (HCI), Physically based modeling, Realtime interaction, Gesture based interaction, Kinect
Additional information: 92 p.

Division(s): 20 Department of Computer Science
20 Department of Computer Science > Graphisch-Interaktive Systeme
Date deposited: 12 Nov 2018 11:16
Last modified: 12 Nov 2018 11:16