Mallat, Khawla ; Damer, Naser ; Boutros, Fadi ; Dugelay, Jean-Luc (2019)
Robust Face Authentication Based on Dynamic Quality-weighted Comparison of Visible and Thermal-to-visible images to Visible Enrollments.
22nd International Conference on Information Fusion (FUSION'19). Ottawa, Canada (02.07.2019-05.07.2019)
Conference publication, bibliography entry
Abstract
We introduce, in this paper, a new scheme of score-level fusion for face authentication from visible and thermal face data. The proposed scheme allows fast and straightforward integration into existing face recognition systems and does not require re-collection of enrollment data in the thermal spectrum. Besides its possible use as a countermeasure against spoofing, this paper investigates the potential role of the thermal spectrum in improving face recognition performance under adversarial acquisition conditions. We consider a context where individuals have been enrolled solely in the visible spectrum, and their identity is verified using two sets of probes: visible and thermal. We show that the optimal way to proceed is to synthesize a visible image from the thermal face in order to create a synthetic-visible probe, and then to fuse the scores resulting from comparing the visible gallery with both the visible probe and the synthetic-visible probe. The thermal-to-visible face synthesis is performed using a Cascaded Refinement Network (CRN), and face features are extracted and matched using LightCNN and Local Binary Patterns (LBP). The fusion procedure is based on several quality measures computed on both the visible and the thermal-to-visible generated probes and compared to the visible gallery images.
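The abstract does not give the exact fusion formula, but the general idea of dynamic quality-weighted score-level fusion can be illustrated with a minimal sketch. The function below assumes a simple weighted sum normalized by the quality values; the actual weighting used in the paper may differ, and the variable names are illustrative only.

```python
def quality_weighted_fusion(s_vis, s_syn, q_vis, q_syn):
    """Illustrative quality-weighted score-level fusion (assumption,
    not the paper's exact formula).

    s_vis: comparison score, visible probe vs. visible gallery
    s_syn: comparison score, thermal-to-visible synthetic probe vs. gallery
    q_vis, q_syn: non-negative quality measures for the respective probes
    """
    total = q_vis + q_syn
    if total == 0:
        # no usable quality information: fall back to a plain average
        return 0.5 * (s_vis + s_syn)
    # higher-quality probe contributes more to the fused score
    return (q_vis * s_vis + q_syn * s_syn) / total


# Example: a high-quality visible probe dominates the fused score.
fused = quality_weighted_fusion(s_vis=0.8, s_syn=0.4, q_vis=3.0, q_syn=1.0)
```

With these example values the fused score lands closer to the visible probe's score (0.7), reflecting its higher quality weight; the synthetic probe still contributes, which is the point of fusing rather than discarding the thermal channel.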
| Field | Value |
|---|---|
| Type of entry | Conference publication |
| Published | 2019 |
| Author(s) | Mallat, Khawla; Damer, Naser; Boutros, Fadi; Dugelay, Jean-Luc |
| Kind of entry | Bibliography |
| Title | Robust Face Authentication Based on Dynamic Quality-weighted Comparison of Visible and Thermal-to-visible images to Visible Enrollments |
| Language | English |
| Year of publication | 2019 |
| Event title | 22nd International Conference on Information Fusion (FUSION'19) |
| Event location | Ottawa, Canada |
| Event dates | 02.07.2019–05.07.2019 |
| URL / URN | https://www.fusion2019.org/program.html |
| Uncontrolled keywords | Biometric fusion; Biometrics; Face recognition |
| Department(s)/field(s) | 20 Department of Computer Science; 20 Department of Computer Science > Graphisch-Interaktive Systeme |
| Date deposited | 14 Apr 2020 06:51 |
| Last modified | 14 Apr 2020 06:51 |