
Video-Based Demonstration: Hands-Free/Eyes-Free Support for Mobile Workers

Mühlhäuser, Max and Schnelle-Walka, Dirk and Aitenbichler, Erwin and Kangasharju, Jussi (2005):
Video-Based Demonstration: Hands-Free/Eyes-Free Support for Mobile Workers.
In: Proc. 2nd Annual International Conference on Mobile and Ubiquitous Systems.

Abstract

One focus area of our ubiquitous computing research is context-aware support for people on the move, in particular for mobile workers who need their hands and eyes free. "Mobile" here refers to everything except "sitting at a PC": it includes, e.g., workers at assembly lines or in laboratories. To understand our focus, imagine an employee carrying out a manual activity as part of her job, in an environment equipped with smart items (tags, sensors, …). Based on process models, enterprise integration software, and the context awareness these provide, applications can (i) "understand" the employee's tasks, actions, and needs, and (ii) deliver appropriate information and instructions to her local device. Since such an application can probably never foresee the user's needs with 100% accuracy, a mixed-initiative approach is desirable, in which the user can at any time access arbitrary pertinent information in a kind of "audio browsing" mode. In fact, our prototypes put the users in control of the interface as much as possible. Our studies have shown that, for many application areas today, audio support is preferred over the often-cited support based on augmented reality and head-mounted displays.
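
As a rough illustration of the mixed-initiative idea sketched in the abstract, the following Python fragment pairs system-driven instruction delivery (advanced by sensed context events) with user-driven "audio browsing" commands that may interrupt at any time. It is not from the paper; all class names, method names, and example data are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Step:
    """One task step from the process model, with browsable extra details."""
    instruction: str
    details: dict = field(default_factory=dict)

class AudioTaskAssistant:
    """Minimal sketch of a mixed-initiative, hands-free task assistant."""

    def __init__(self, steps):
        self.steps = steps
        self.current = 0

    def speak(self, text):
        # Stand-in for a text-to-speech backend on the worker's local device.
        print(f"[TTS] {text}")

    def on_context_event(self, event):
        """System initiative: a sensed event (smart tag, sensor) advances the task."""
        if event == "step_completed" and self.current < len(self.steps):
            self.current += 1
            if self.current < len(self.steps):
                self.speak(self.steps[self.current].instruction)
            else:
                self.speak("Task finished.")

    def on_voice_command(self, command):
        """User initiative: audio browsing of pertinent information, at any time."""
        step = self.steps[min(self.current, len(self.steps) - 1)]
        if command == "repeat":
            self.speak(step.instruction)
        elif command in step.details:
            self.speak(step.details[command])
        else:
            self.speak(f"No information on '{command}' for this step.")

if __name__ == "__main__":
    assistant = AudioTaskAssistant([
        Step("Attach part A to the mounting bracket.",
             {"torque": "Tighten the bolts to 12 newton metres."}),
        Step("Connect the sensor cable to port 3."),
    ])
    assistant.speak(assistant.steps[0].instruction)  # system starts the task
    assistant.on_voice_command("torque")             # user interrupts to browse
    assistant.on_context_event("step_completed")     # sensed completion advances
```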

Item Type: Other
Published: 2005
Creators: Mühlhäuser, Max and Schnelle-Walka, Dirk and Aitenbichler, Erwin and Kangasharju, Jussi
Title: Video-Based Demonstration: Hands-Free/Eyes-Free Support for Mobile Workers
Language: English
Title of Book: Proc. 2nd Annual International Conference on Mobile and Ubiquitous Systems
Uncontrolled Keywords: TNT - Area Talk and Touch Interaction; TNT: STAIRS
Divisions: 20 Department of Computer Science > Telecooperation; 20 Department of Computer Science
Date Deposited: 31 Dec 2016 12:59
Identification Number: TUD-CS-2005-0003