
Evidence for human-centric in-vehicle lighting: Part 2 — Modeling illumination based on color-opponents

Weirich, Christopher ; Lin, Yandan ; Khanh, Tran Quoc (2022)
Evidence for human-centric in-vehicle lighting: Part 2 — Modeling illumination based on color-opponents.
In: Frontiers in Neuroscience, 2022, 16
doi: 10.26083/tuprints-00022498
Article, secondary publication, publisher's version

Abstract

Illumination preference models are usually defined in static scenery, rating commonly colored objects on a single scale or by semantic differentials. Recently, it was reported that two to three illumination characteristics are necessary to achieve a high correlation in a bright office-like environment. However, white-light illumination preferences for vehicle occupants in the dynamic context of semi- to fully automated modern driving are missing. Here, we conducted a globally free-access online survey using VR engines to create 360° sRGB static in-vehicle sceneries. A total of 164 participants from China and Europe answered three levels of our self-hosted questionnaire using mobile access devices. First, the absolute perceptual difference was to be defined by varying CCT (3,000, 4,500, and 6,000 K, or combinations thereof) and light distribution, either spot or spatial. Second, psychological light attributes were to be associated with the same illumination and scenery settings. Finally, we created four driving environments with varying external levels of interest and times of day. We identified three key results: (1) Four illumination groups could be classified by applying non-metric multidimensional scaling (nMDS). (2) Combinations of mixed CCTs and spatial light distributions outperformed single light settings (p < 0.05), suggesting that artificial light supplements are necessary even during daylight conditions. (3) By transforming images into the IPT and CAM16 color appearance spaces and comparing the external and in-vehicle scenery, individual illumination working areas could be identified for each driving scenery, especially in the dimensions of chroma, partially following the Hunt effect, and lightness contrast, which synchronizes the internal and external brightness levels. We regard our results as a starting point, which we intend to verify in a follow-up controlled laboratory study with real object arrangements.
Also, by applying novel methods to display high-fidelity 360° rendered images on mobile access devices, our approach can be used in future interdisciplinary research, since mobile devices with high computational power and advanced sensory systems are the new standard of our daily life.
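The abstract's third result rests on transforming sRGB scenery images into a color appearance space and comparing chroma between the exterior and in-vehicle views. A minimal sketch of that idea, assuming the standard sRGB (IEC 61966-2-1) and IPT (Ebner & Fairchild, 1998) matrices — the paper's actual IPT/CAM16 pipeline is not reproduced here:

```python
import numpy as np

# Assumed standard matrices (not taken from the paper): sRGB (D65) -> XYZ,
# and the Ebner & Fairchild (1998) IPT model.
M_SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                          [0.2126, 0.7152, 0.0722],
                          [0.0193, 0.1192, 0.9505]])
M_XYZ_TO_LMS = np.array([[ 0.4002, 0.7075, -0.0807],
                         [-0.2280, 1.1500,  0.0612],
                         [ 0.0000, 0.0000,  0.9184]])
M_LMS_TO_IPT = np.array([[0.4000,  0.4000,  0.2000],
                         [4.4550, -4.8510,  0.3960],
                         [0.8056,  0.3572, -1.1628]])

def srgb_to_linear(rgb):
    """Invert the sRGB transfer function (IEC 61966-2-1)."""
    rgb = np.asarray(rgb, dtype=float)
    return np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)

def srgb_to_ipt(rgb):
    """Map one sRGB pixel (components in 0..1) to IPT coordinates."""
    xyz = M_SRGB_TO_XYZ @ srgb_to_linear(rgb)
    lms = M_XYZ_TO_LMS @ xyz
    # Sign-preserving power nonlinearity of the IPT model.
    lms_p = np.sign(lms) * np.abs(lms) ** 0.43
    return M_LMS_TO_IPT @ lms_p

def ipt_chroma(ipt):
    """Chroma-like correlate: distance from the achromatic (P = T = 0) axis."""
    return float(np.hypot(ipt[1], ipt[2]))

# sRGB white lands on the achromatic axis: I near 1, P and T near 0.
white = srgb_to_ipt([1.0, 1.0, 1.0])
```

Comparing `ipt_chroma` averaged over an exterior image against the same statistic for the in-vehicle rendering would, under these assumptions, yield the kind of chroma contrast the abstract describes.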

Item type: Article
Published: 2022
Author(s): Weirich, Christopher ; Lin, Yandan ; Khanh, Tran Quoc
Type of entry: Secondary publication
Title: Evidence for human-centric in-vehicle lighting: Part 2 — Modeling illumination based on color-opponents
Language: English
Year of publication: 2022
Place of publication: Darmstadt
Date of first publication: 2022
Publisher: Frontiers Media S.A.
Title of journal, newspaper, or series: Frontiers in Neuroscience
Journal volume: 16
Collation: 19 pages
DOI: 10.26083/tuprints-00022498
URL / URN: https://tuprints.ulb.tu-darmstadt.de/22498
Origin: Secondary publication via DeepGreen

Uncontrolled keywords: in-vehicle illumination, psychological-light relation, light-scene preferences, dynamic-scenery, in-vehicle human factors
Status: Publisher's version
URN: urn:nbn:de:tuda-tuprints-224980
Dewey Decimal Classification (DDC) subject group: 600 Technology, medicine, applied sciences > 620 Engineering and mechanical engineering
Department(s)/field(s): 18 Department of Electrical Engineering and Information Technology
18 Department of Electrical Engineering and Information Technology > Adaptive Lighting Systems and Visual Processing
Date deposited: 24 Oct 2022 13:09
Last modified: 25 Oct 2022 07:46
