Lissermann, Roman ; Huber, Jochen ; Hadjakos, Aristotelis ; Nanayakkara, Suranga ; Mühlhäuser, Max (2014)
EarPut: Augmenting Ear-worn Devices for Ear-based Interaction.
doi: 10.1145/2686612.2686655
Conference publication, Bibliography
Abstract
One of the pervasive challenges in mobile interaction is decreasing the visual demand of interfaces towards eyes-free interaction. In this paper, we focus on the unique affordances of the human ear to support one-handed and eyes-free mobile interaction. We present EarPut, a novel interface concept and hardware prototype, which unobtrusively augments a variety of accessories that are worn behind the ear (e.g. headsets or glasses) to instrument the human ear as an interactive surface. The contribution of this paper is three-fold. We contribute (i) results from a controlled experiment with 27 participants, providing empirical evidence that people are able to target salient regions on their ear effectively and precisely, (ii) a first, systematically derived design space for ear-based interaction and (iii) a set of proof of concept EarPut applications that leverage on the design space and embrace mobile media navigation, mobile gaming and smart home interaction.
Type of entry: | Conference publication |
---|---|
Published: | 2014 |
Author(s): | Lissermann, Roman ; Huber, Jochen ; Hadjakos, Aristotelis ; Nanayakkara, Suranga ; Mühlhäuser, Max |
Type of record: | Bibliography |
Title: | EarPut: Augmenting Ear-worn Devices for Ear-based Interaction |
Language: | English |
Year of publication: | 2014 |
Book title: | Proceedings of the 26th Australian Computer-Human Interaction Conference on Designing Futures - the Future of Design, OZCHI '14, Sydney, New South Wales, Australia, December 2-5, 2014 |
DOI: | 10.1145/2686612.2686655 |
Uncontrolled keywords: | TI: Interactive Surfaces; Computer Vision |
ID-Nummer: | TUD-CS-2014-0916 |
Department(s)/division(s): | 20 Department of Computer Science; 20 Department of Computer Science > Modelling and Analysis of Information Systems (MAIS); 20 Department of Computer Science > Telecooperation |
Date deposited: | 31 Dec 2016 12:59 |
Last modified: | 14 Jun 2021 06:14 |