Template-Driven Knowledge Distillation for Compact and Accurate Periocular Biometrics Deep-Learning Models

Boutros, Fadi ; Damer, Naser ; Raja, Kiran ; Kirchbuchner, Florian ; Kuijper, Arjan (2022)
Template-Driven Knowledge Distillation for Compact and Accurate Periocular Biometrics Deep-Learning Models.
In: Sensors, 2022, 22 (5)
doi: 10.26083/tuprints-00021119
Article, Secondary publication, Publisher's Version

Warning: There is a more recent version of this item available.

Abstract

This work addresses the challenge of building an accurate and generalizable periocular recognition model with a small number of learnable parameters. Deeper (larger) models are typically more capable of learning complex information. For this reason, knowledge distillation (KD) was previously proposed to carry this knowledge from a large model (teacher) into a small model (student). Conventional KD optimizes the student output to be similar to the teacher output (commonly the classification output). In biometrics, comparison (verification) and storage operations are conducted on biometric templates, extracted from pre-classification layers. In this work, we propose a novel template-driven KD approach that optimizes the distillation process so that the student model learns to produce templates similar to those produced by the teacher model. We demonstrate our approach on intra- and cross-device periocular verification. Our results show the superiority of our proposed approach over a network trained without KD and networks trained with conventional (vanilla) KD. For example, the targeted small model achieved an equal error rate (EER) of 22.2% on cross-device verification without KD. The same model achieved an EER of 21.9% with conventional KD, and an EER of only 14.7% when using our proposed template-driven KD.
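
To make the contrast between conventional (vanilla) KD and the template-driven KD described above concrete, the sketch below shows how the two objectives could be written in PyTorch. It is only an illustrative reading of the abstract, not the paper's implementation: the MSE template distance, the temperature-scaled KL divergence for vanilla KD, the loss weight alpha, and all function names are assumptions.

import torch
import torch.nn.functional as F

def vanilla_kd_loss(student_logits, teacher_logits, temperature=4.0):
    # Conventional KD: match the student's classification output to the
    # teacher's softened output (formulation assumed, not from the paper).
    t = temperature
    return F.kl_div(
        F.log_softmax(student_logits / t, dim=1),
        F.softmax(teacher_logits / t, dim=1),
        reduction="batchmean",
    ) * (t * t)

def template_kd_loss(student_template, teacher_template):
    # Template-driven KD: push the student's pre-classification embedding
    # (the biometric template) towards the teacher's template. MSE is used
    # here purely for illustration.
    return F.mse_loss(student_template, teacher_template)

def student_training_loss(student_logits, labels, student_template,
                          teacher_template, alpha=1.0):
    # Hypothetical combined objective: supervised classification loss plus
    # the template-level distillation term, weighted by a placeholder alpha.
    classification_loss = F.cross_entropy(student_logits, labels)
    return classification_loss + alpha * template_kd_loss(student_template,
                                                          teacher_template)

The only structural difference between the two variants is where the teacher signal is taken from: the softened classification output in vanilla KD versus the pre-classification embedding (the biometric template, which is what is actually compared and stored in a biometric system) in the template-driven variant.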

Item Type: Article
Published: 2022
Creators: Boutros, Fadi ; Damer, Naser ; Raja, Kiran ; Kirchbuchner, Florian ; Kuijper, Arjan
Type of entry: Secondary publication
Title: Template-Driven Knowledge Distillation for Compact and Accurate Periocular Biometrics Deep-Learning Models
Language: English
Date: 2022
Year of primary publication: 2022
Publisher: MDPI
Journal or Publication Title: Sensors
Volume of the journal: 22
Issue Number: 5
Collation: 14 pages
DOI: 10.26083/tuprints-00021119
URL / URN: https://tuprints.ulb.tu-darmstadt.de/21119
Corresponding Links:
Origin: Secondary publication DeepGreen

Uncontrolled Keywords: biometrics, knowledge distillation, periocular verification
Status: Publisher's Version
URN: urn:nbn:de:tuda-tuprints-211196
Classification DDC: 000 Generalities, computers, information > 004 Computer science
Divisions: 20 Department of Computer Science
20 Department of Computer Science > Fraunhofer IGD
20 Department of Computer Science > Mathematical and Applied Visual Computing
Date Deposited: 11 Apr 2022 11:36
Last Modified: 12 Apr 2022 09:44

Available Versions of this Item
