Schwehr, Julian (2021)
Gaze Target Tracking for Driver Assistance Systems.
doi: 10.26083/tuprints-00018650
Book, Secondary publication, Publisher's version
A newer version of this entry is available.
Abstract
Despite many supporting systems, so-called advanced driver assistance systems (ADAS), human error is still by far the main cause of traffic accidents. In the development of new driver assistance concepts, systems and functions that monitor the driver while driving and classify their behavior in the driving context are therefore increasingly coming to the fore. In this context, this dissertation addresses the question of what the driver perceives in their environment. For this purpose, the information from the vehicle's environment model has to be merged with measured gaze data. Given a precise calibration of the individual sensors, visual fixations of the driver on road users are modeled.
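To illustrate this fusion step, the following minimal sketch associates a measured gaze direction with objects from the environment model by their angular offset to the gaze ray. It is not taken from the dissertation; the function names, the shared vehicle coordinate frame, and the 5° association threshold are assumptions made only for this example.

```python
# Hypothetical sketch of merging the environment model with measured gaze data:
# associate the gaze ray with tracked objects by angular distance. All names and
# thresholds are illustrative assumptions, not values from the dissertation.
import numpy as np

def angular_offset(gaze_dir, eye_pos, obj_pos):
    """Angle (rad) between the gaze ray and the line of sight to an object."""
    to_obj = np.asarray(obj_pos, dtype=float) - np.asarray(eye_pos, dtype=float)
    gaze = np.asarray(gaze_dir, dtype=float)
    cos_a = np.dot(gaze, to_obj) / (np.linalg.norm(gaze) * np.linalg.norm(to_obj))
    return np.arccos(np.clip(cos_a, -1.0, 1.0))

def nearest_gazed_object(gaze_dir, eye_pos, objects, max_offset_rad=np.deg2rad(5)):
    """Naive geometric baseline: pick the object closest to the gaze ray."""
    best_id, best_angle = None, max_offset_rad
    for obj_id, obj_pos in objects.items():
        a = angular_offset(gaze_dir, eye_pos, obj_pos)
        if a < best_angle:
            best_id, best_angle = obj_id, a
    return best_id  # None if no object lies within the angular threshold
```

Such a purely geometric nearest-object assignment is exactly the kind of baseline that the next paragraph argues is not precise enough on its own.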
Based on the realization that simple geometric approaches cannot answer this question of visual fixations precisely enough, characteristics of human gaze behavior are identified and integrated as model knowledge into a probabilistic tracking approach. This tracking model considers every object that the vehicle's environment perception module classifies as a dynamic object, and thus as a potential road user, to be a possible hypothesis for the driver's current visual attention target. In addition, two different motion models of eye movements, for fixations and for saccades, are integrated, so that the estimation of the gaze target can follow the particular dynamics of human gaze and recognize coherent time spans of fixation. The advantage of the resulting novel Multi-Hypothesis Multi-Model (MHMM) filter is the confidence measure characteristic of probabilistic approaches, indicating the probability of each object being fixated by the driver. A challenge is the evaluation of such new algorithms: to state which object the driver actually fixates visually, ground-truth information is necessary, and this cannot be obtained from questionnaires. For this reason, a reference data set is created in which the recordings of the remote eye-tracking system installed in the vehicle are extended with the data of wearable eye-tracking glasses. With the help of these recordings, different model approaches can be compared on a quantitative and not only qualitative basis.
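As a rough, hypothetical illustration of the multi-hypothesis principle (not the MHMM filter itself), the sketch below runs a single discrete Bayes filter step over N object hypotheses, with a fixation model that keeps the current target and a saccade model that switches to another one. All parameters are invented for this example.

```python
# Highly simplified, hypothetical illustration of the multi-hypothesis idea: one
# discrete Bayes (HMM forward) step over object hypotheses, where the gaze either
# stays on the current target (fixation) or jumps to another one (saccade).
# This is NOT the dissertation's MHMM filter, only a sketch of the principle.
import numpy as np

def mhmm_step(prior, gaze_offsets_rad, p_stay=0.9, sigma_rad=np.deg2rad(2)):
    """One prediction/update step over N >= 2 object hypotheses.

    prior            -- length-N vector, probability of each object being fixated
    gaze_offsets_rad -- length-N vector, angle between gaze ray and each object
    p_stay           -- probability of keeping the current target (fixation model)
    sigma_rad        -- assumed measurement noise of the gaze angle
    """
    n = len(prior)
    # Prediction: a fixation keeps the target, a saccade switches uniformly.
    transition = np.full((n, n), (1 - p_stay) / (n - 1))
    np.fill_diagonal(transition, p_stay)
    predicted = transition.T @ np.asarray(prior, dtype=float)
    # Update: Gaussian likelihood of the measured angular offsets.
    likelihood = np.exp(-0.5 * (np.asarray(gaze_offsets_rad) / sigma_rad) ** 2)
    posterior = predicted * likelihood
    return posterior / posterior.sum()  # confidence per object hypothesis
```

For instance, starting from a uniform prior over three hypothetical objects, `mhmm_step(np.ones(3) / 3, np.deg2rad([1.0, 8.0, 15.0]))` concentrates almost all of the posterior probability on the first object, which lies closest to the gaze ray.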
The prototypical City Assistant System, which was co-developed as part of this work, shows how the newly gained information about the driver's gaze behavior can be incorporated into new assistance concepts. It adapts its warning and recommendation cascade in urban intersection scenarios to the driver's driving style and gaze behavior. By orienting itself towards the driver's need for support, the City Assistant System contributes to higher acceptance of warning and recommendation systems and ultimately to increased road safety.
Item type: | Book
---|---
Published: | 2021
Author(s): | Schwehr, Julian
Type of entry: | Secondary publication
Title: | Gaze Target Tracking for Driver Assistance Systems
Language: | English
Referees: | Adamy, Prof. Dr. Jürgen ; Winner, Prof. Dr. Hermann
Year of publication: | 2021
Place of publication: | Darmstadt
Date of first publication: | 2020
Collation: | xxi, 210 pages
Date of oral examination: | 15 July 2020
DOI: | 10.26083/tuprints-00018650
URL / URN: | https://tuprints.ulb.tu-darmstadt.de/18650
Origin: | Secondary publication
Status: | Publisher's version
URN: | urn:nbn:de:tuda-tuprints-186509
Additional information: | Also published by Shaker Verlag in the series "Berichte aus der Fahrzeugtechnik" under ISBN 978-3-8440-7702-5.
Dewey Decimal Classification (DDC): | 600 Technology, medicine, applied sciences > 620 Engineering and mechanical engineering
Department(s)/Field(s): | 18 Fachbereich Elektrotechnik und Informationstechnik > Institut für Automatisierungstechnik und Mechatronik > Regelungsmethoden und Robotik (renamed to Regelungsmethoden und Intelligente Systeme as of 01.08.2022)
Date deposited: | 29 Jul 2021 08:13
Last modified: | 03 Aug 2021 06:59