TU Darmstadt / ULB / TUbiblio

Compact Models for Periocular Verification Through Knowledge Distillation

Boutros, Fadi ; Damer, Naser ; Fang, Meiling ; Raja, Kiran ; Kirchbuchner, Florian ; Kuijper, Arjan (2020)
Compact Models for Periocular Verification Through Knowledge Distillation.
19th International Conference of the Biometrics Special Interest Group (BIOSIG 2020). Virtual conference (16–18 September 2020)
Conference publication, Bibliography

Abstract

Despite the wide use of deep neural networks for periocular verification, achieving small deep learning models with high performance that can be deployed on devices with low computational power remains a challenge. Addressing computational cost, we present in this paper a lightweight deep learning model, DenseNet-20, based on the DenseNet architecture, with only 1.1 million trainable parameters. Further, we present an approach to enhance the verification performance of DenseNet-20 via knowledge distillation. With experiments on the VISPI dataset, captured with two different smartphones (iPhone and Nokia), we show that introducing knowledge distillation into the DenseNet-20 training phase outperforms the same model trained without knowledge distillation: the Equal Error Rate (EER) is reduced from 8.36% to 4.56% on iPhone data, from 5.33% to 4.64% on Nokia data, and from 20.98% to 15.54% on cross-smartphone data.
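The knowledge-distillation training described in the abstract can be sketched as below. This is a minimal illustration of the standard Hinton-style distillation loss (hard-label cross-entropy plus temperature-softened teacher/student KL divergence); the temperature and weighting values are illustrative assumptions, not the paper's actual configuration.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_idx,
                      temperature=4.0, alpha=0.5):
    """Weighted sum of the hard-label cross-entropy and the KL divergence
    between softened teacher and student distributions.

    temperature and alpha are illustrative hyperparameters, not the
    values used in the paper.
    """
    # Hard loss: cross-entropy of the student against the ground-truth label.
    p_student = softmax(student_logits)
    hard = -math.log(p_student[true_idx])
    # Soft loss: KL(teacher || student) at temperature T, scaled by T^2
    # so its gradient magnitude stays comparable to the hard loss.
    pt = softmax(teacher_logits, temperature)
    ps = softmax(student_logits, temperature)
    soft = sum(t * math.log(t / s) for t, s in zip(pt, ps)) * temperature ** 2
    return alpha * hard + (1.0 - alpha) * soft
```

In this formulation, a compact student such as DenseNet-20 is trained to match both the ground-truth labels and the softened output distribution of a larger teacher network; when the student already matches the teacher, the soft term vanishes and only the hard-label loss remains.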

Item type: Conference publication
Published: 2020
Author(s): Boutros, Fadi ; Damer, Naser ; Fang, Meiling ; Raja, Kiran ; Kirchbuchner, Florian ; Kuijper, Arjan
Entry type: Bibliography
Title: Compact Models for Periocular Verification Through Knowledge Distillation
Language: English
Year of publication: 2020
Event title: 19th International Conference of the Biometrics Special Interest Group (BIOSIG 2020)
Event location: Virtual conference
Event dates: 16–18 September 2020
URL / URN: https://dl.gi.de/handle/20.500.12116/34340
Keywords: Biometrics, Knowledge processing, Deep learning, Machine learning
Division(s): 20 Department of Computer Science
20 Department of Computer Science > Graphisch-Interaktive Systeme
20 Department of Computer Science > Mathematisches und angewandtes Visual Computing
Date deposited: 29 Sep 2020 07:53
Last modified: 29 Sep 2020 07:53
