Compact Models for Periocular Verification Through Knowledge Distillation

Boutros, Fadi and Damer, Naser and Fang, Meiling and Raja, Kiran and Kirchbuchner, Florian and Kuijper, Arjan (2020):
Compact Models for Periocular Verification Through Knowledge Distillation.
In: 19th International Conference of the Biometrics Special Interest Group (BIOSIG 2020), Virtual Conference, 16.-18.09.2020, pp. 291-298. ISBN 978-3-88579-700-5.
[Conference or Workshop Item]

Abstract

Despite the wide use of deep neural networks for periocular verification, achieving small deep learning models with high performance that can be deployed on devices with low computational power remains a challenge. To address computational cost, we present in this paper a lightweight deep learning model, DenseNet-20, based on the DenseNet architecture with only 1.1 million trainable parameters. Further, we present an approach to enhance the verification performance of DenseNet-20 via knowledge distillation. Through experiments on the VISPI dataset, captured with two different smartphones (iPhone and Nokia), we show that introducing knowledge distillation into the DenseNet-20 training phase outperforms the same model trained without it, reducing the Equal Error Rate (EER) from 8.36% to 4.56% on iPhone data, from 5.33% to 4.64% on Nokia data, and from 20.98% to 15.54% on cross-smartphone data.
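For orientation, the sketch below shows the kind of teacher-student training the abstract refers to, using the classic softened-logits knowledge distillation formulation (Hinton et al.) in PyTorch. The paper's actual loss, teacher architecture, and hyperparameters are not given in this record; the temperature T, weight alpha, and the train_step helper are assumptions for illustration only.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Generic Hinton-style knowledge distillation loss (not the paper's exact loss).

    Combines cross-entropy on the ground-truth labels with a KL-divergence
    term that pulls the student's temperature-softened class distribution
    towards the teacher's. T and alpha are hypothetical hyperparameters.
    """
    # Hard-label task loss for the student.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label loss: KL divergence between temperature-softened outputs.
    # The T**2 factor keeps gradient magnitudes comparable across temperatures.
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T ** 2)
    return alpha * ce + (1.0 - alpha) * kd

def train_step(student, teacher, images, labels, optimizer):
    """One training step: a frozen teacher guides the compact student."""
    teacher.eval()
    with torch.no_grad():
        teacher_logits = teacher(images)
    student_logits = student(images)
    loss = distillation_loss(student_logits, teacher_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this setup only the student (here, a compact network such as DenseNet-20) is updated; the larger teacher provides soft targets that carry inter-class similarity information beyond the one-hot labels.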

Item Type: Conference or Workshop Item
Published: 2020
Creators: Boutros, Fadi and Damer, Naser and Fang, Meiling and Raja, Kiran and Kirchbuchner, Florian and Kuijper, Arjan
Title: Compact Models for Periocular Verification Through Knowledge Distillation
Language: English
ISBN: 978-3-88579-700-5
Uncontrolled Keywords: Biometrics, Knowledge processing, Deep learning, Machine learning
Divisions: 20 Department of Computer Science
20 Department of Computer Science > Interactive Graphics Systems
20 Department of Computer Science > Mathematical and Applied Visual Computing
Event Title: 19th International Conference of the Biometrics Special Interest Group (BIOSIG 2020)
Event Location: Virtual Conference
Event Dates: 16.-18.09.2020
Date Deposited: 29 Sep 2020 07:53
Official URL: https://dl.gi.de/handle/20.500.12116/34340