
Template-Driven Knowledge Distillation for Compact and Accurate Periocular Biometrics Deep-Learning Models

Boutros, Fadi ; Damer, Naser ; Raja, Kiran ; Kirchbuchner, Florian ; Kuijper, Arjan (2022)
Template-Driven Knowledge Distillation for Compact and Accurate Periocular Biometrics Deep-Learning Models.
In: Sensors, 22 (5)
doi: 10.3390/s22051921
Article, Bibliography


Abstract

This work addresses the challenge of building an accurate and generalizable periocular recognition model with a small number of learnable parameters. Deeper (larger) models are typically more capable of learning complex information. For this reason, knowledge distillation (KD) was previously proposed to carry this knowledge from a large model (teacher) into a small model (student). Conventional KD optimizes the student output to be similar to the teacher output (commonly the classification output). In biometrics, however, comparison (verification) and storage operations are conducted on biometric templates extracted from pre-classification layers. In this work, we propose a novel template-driven KD approach that optimizes the distillation process so that the student model learns to produce templates similar to those produced by the teacher model. We demonstrate our approach on intra- and cross-device periocular verification. Our results demonstrate the superiority of our proposed approach over a network trained without KD and networks trained with conventional (vanilla) KD. For example, the targeted small model achieved an equal error rate (EER) of 22.2% on cross-device verification without KD. The same model achieved an EER of 21.9% with conventional KD, and only 14.7% EER when using our proposed template-driven KD.
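The abstract describes distillation at the template (embedding) level rather than at the classification output: the student is trained so that its templates resemble the teacher's. A minimal sketch of such a template-level distillation loss in plain Python is given below; the mean-squared-error form, the `alpha` weighting, and the function names are illustrative assumptions, not the paper's exact formulation.

```python
def mse(a, b):
    """Mean squared error between two equal-length template vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def template_kd_loss(student_template, teacher_template,
                     classification_loss, alpha=0.5):
    """Hypothetical combined loss for template-driven distillation.

    alpha balances the template-matching term against the student's own
    classification loss; the specific value and the MSE choice are
    assumptions for illustration.
    """
    template_term = mse(student_template, teacher_template)
    return alpha * template_term + (1 - alpha) * classification_loss
```

In contrast, vanilla KD would replace `template_term` with a divergence between the teacher's and student's softened classification outputs; matching templates instead directly targets the representation actually used for verification and storage.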

Entry type: Article
Published: 2022
Author(s): Boutros, Fadi ; Damer, Naser ; Raja, Kiran ; Kirchbuchner, Florian ; Kuijper, Arjan
Type of entry: Bibliography
Title: Template-Driven Knowledge Distillation for Compact and Accurate Periocular Biometrics Deep-Learning Models
Language: English
Year of publication: 2022
Publisher: MDPI
Journal or series title: Sensors
Volume: 22
Issue number: 5
Collation: 14 pages
DOI: 10.3390/s22051921
Keywords: biometrics, knowledge distillation, periocular verification
Dewey Decimal Classification (DDC): 000 Generalities, computer science, information science > 004 Computer science
Department(s)/division(s): 20 Department of Computer Science
20 Department of Computer Science > Fraunhofer IGD
20 Department of Computer Science > Mathematical and Applied Visual Computing
Date deposited: 02 Aug 2024 12:39
Last modified: 02 Aug 2024 12:39