
Possible applications for gestures while driving

Zöller, Ilka ; Bechmann, Roman ; Abendroth, Bettina (2023)
Possible applications for gestures while driving.
In: Automotive and Engine Technology, 2018, 3 (1-2)
doi: 10.26083/tuprints-00024591
Article, secondary publication, postprint

Warning: A newer version of this entry is available.

Abstract

The ongoing trend of integrating new comfort and entertainment functionality into cars is leading to an increase in the number of controls operated by the driver. In order to position as much functionality as possible within optimum reach of the driver, complex control designs such as rotary buttons and touchscreens are increasingly being implemented to guide the driver through multi-level menus. However, this also increases the mental effort, hand-eye coordination, visual distraction, and time required to operate these controls. Touch-free gesture control represents an innovative control design with the potential to improve the efficiency of human-machine interactions while driving. The Institute of Ergonomics & Human Factors (IAD) at the Technische Universität Darmstadt is currently researching which gestures could theoretically be implemented, based on the criteria of distinguishability, intuitiveness, and efficiency of operation. This paper presents the first results of a preliminary study, as well as a comprehensive catalogue of gestures that meet the criteria identified thus far, which any efficient gesture-based control design should satisfy.

Item type: Article
Published: 2023
Author(s): Zöller, Ilka ; Bechmann, Roman ; Abendroth, Bettina
Type of entry: Secondary publication
Title: Possible applications for gestures while driving
Language: English
Date of publication: 24 October 2023
Place of publication: Darmstadt
Date of first publication: 2018
Place of first publication: Cham
Publisher: Springer
Journal, newspaper, or series title: Automotive and Engine Technology
Volume: 3
Issue number: 1-2
Collation: 10 pages
DOI: 10.26083/tuprints-00024591
URL / URN: https://tuprints.ulb.tu-darmstadt.de/24591
Origin: Secondary publication service

Free keywords: Driving, Human-machine-interface, Gesture control, Gaze aversion
Status: Postprint
URN: urn:nbn:de:tuda-tuprints-245911
Dewey Decimal Classification (DDC) subject group: 600 Technology, medicine, applied sciences > 620 Engineering and mechanical engineering
Department(s)/field(s): 16 Department of Mechanical Engineering
16 Department of Mechanical Engineering > Institut für Arbeitswissenschaft (IAD)
Date deposited: 24 Oct 2023 09:14
Last modified: 26 Oct 2023 06:26
