
Measuring Pedestrians’ Gap Acceptance When Interacting with Vehicles - A Human Gait Oriented Approach

Theobald, Nina ; Joisten, Philip ; Abendroth, Bettina (2022)
Measuring Pedestrians’ Gap Acceptance When Interacting with Vehicles - A Human Gait Oriented Approach.
24th International Conference on Human-Computer Interaction, HCII 2022. Virtual Event (26.06.-01.07.2022)
doi: 10.26083/tuprints-00021864
Conference publication, secondary publication, postprint

Abstract

Gap acceptance is a key variable for describing pedestrian behavior when interacting with vehicles: it denotes the temporal and spatial gaps pedestrians accept when crossing in front of vehicles. After a review of approaches used in previous studies to measure gap acceptance, this paper presents a novel approach that is suitable for use in field experiments and allows subjects to cross naturally. In particular, based on a detailed analysis of the forces exerted during human gait, an algorithm was developed that identifies the exact point in time at which subjects start to cross, which serves as the basis for calculating gap acceptance. Pretest results show the stability and reliability of the system as well as the robustness of the gait algorithm in determining the correct gap acceptance value. The human-gait-oriented approach can serve as a basis for designing interaction processes between pedestrians and automated vehicles, a focus of current research efforts.
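The paper's actual gait algorithm is not reproduced in this record; purely as an illustrative sketch (the force signal, threshold, and all function names below are assumptions, not the authors' method), the accepted temporal gap could be derived from a crossing-start time detected in a sampled force signal:

```python
# Illustrative sketch only: assumes a sampled force signal from the
# subject's first step and a simple threshold as a stand-in for the
# paper's gait-onset detection.

def crossing_start_time(force, sample_rate_hz, threshold):
    """Return the time (s) of the first sample whose force exceeds the
    threshold, used here as a proxy for the start of crossing."""
    for i, f in enumerate(force):
        if f > threshold:
            return i / sample_rate_hz
    return None  # subject never started crossing


def accepted_time_gap(force, sample_rate_hz, threshold, vehicle_arrival_s):
    """Temporal gap accepted by the pedestrian: time remaining between
    the detected start of crossing and the vehicle reaching the
    crossing line."""
    start = crossing_start_time(force, sample_rate_hz, threshold)
    if start is None:
        return None
    return vehicle_arrival_s - start


# Example: 10 Hz signal, force first exceeds the threshold at sample 3
# (0.3 s); the vehicle arrives after 2.0 s, so the accepted gap is 1.7 s.
signal = [0.0, 0.1, 0.2, 5.0, 6.0]
print(accepted_time_gap(signal, 10, 1.0, 2.0))
```

The point of anchoring the calculation to the gait-derived start of crossing, as the abstract describes, is that the gap value then reflects the subject's actual movement rather than a self-reported or observer-coded decision point.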

Entry type: Conference publication
Published: 2022
Author(s): Theobald, Nina ; Joisten, Philip ; Abendroth, Bettina
Type of entry: Secondary publication
Title: Measuring Pedestrians’ Gap Acceptance When Interacting with Vehicles - A Human Gait Oriented Approach
Language: English
Year of publication: 2022
Place of publication: Darmstadt
Date of first publication: 2022
Publisher: Springer
Book title: HCI International 2022 Posters
Series: Communications in Computer and Information Science
Series volume: 1583
Collation: 8 pages
Event title: 24th International Conference on Human-Computer Interaction, HCII 2022
Event location: Virtual Event
Event dates: 26.06.-01.07.2022
DOI: 10.26083/tuprints-00021864
URL / URN: https://tuprints.ulb.tu-darmstadt.de/21864
Origin: Secondary publication service

Uncontrolled keywords: Gap acceptance, Pedestrian, Vehicle, Human gait
Status: Postprint
URN: urn:nbn:de:tuda-tuprints-218647
Dewey Decimal Classification (DDC) subject group: 300 Social sciences > 380 Commerce, communications, transport
600 Technology, medicine, applied sciences > 620 Engineering and mechanical engineering
Department(s)/field(s): 16 Department of Mechanical Engineering
16 Department of Mechanical Engineering > Institut für Arbeitswissenschaft (IAD)
Deposit date: 09 Sep 2022 13:42
Last modified: 27 Oct 2023 10:33