
Towards Explaining Demographic Bias through the Eyes of Face Recognition Models

Fu, Biying ; Damer, Naser (2022)
Towards Explaining Demographic Bias through the Eyes of Face Recognition Models.
International Joint Conference on Biometrics (IJCB). Abu Dhabi, UAE (10.-13.10.2022)
doi: 10.1109/IJCB54206.2022.10007962
Conference publication, Bibliography

Abstract

Biases inherent in both data and algorithms make the fairness of widespread machine learning (ML)-based decision-making systems less than optimal. To improve the trustworthiness of such ML decision systems, it is crucial to be aware of the inherent biases in these solutions and to make them more transparent to the public and to developers. In this work, we aim to provide a set of explainability tools that analyze the differences in face recognition (FR) models' behavior when processing different demographic groups. We do that by leveraging higher-order statistical information based on activation maps to build explainability tools that link the FR models' behavior differences to certain facial regions. The experimental results on two datasets and two face recognition models point out certain areas of the face where the FR models react differently for certain demographic groups compared to reference groups. The outcome of these analyses interestingly aligns well with the results of studies that analyzed the anthropometric differences and the human-judgment differences on the faces of different demographic groups. This is thus the first study that specifically tries to explain the biased behavior of FR models on different demographic groups and to link it directly to spatial facial features. The code is publicly available here.
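The core idea described in the abstract, comparing higher-order statistics of activation maps across demographic groups to localize behavior differences, can be illustrated with a minimal sketch. This is not the authors' published code; the group names, map sizes, and random placeholder activations are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical activation maps: one (H, W) map per image, grouped by demographic.
# In the described approach these would come from intermediate layers of a face
# recognition model; random placeholders are used here purely for illustration.
groups = {
    "reference": rng.normal(0.0, 1.0, size=(100, 7, 7)),
    "group_a":   rng.normal(0.2, 1.1, size=(100, 7, 7)),
}

def spatial_stats(maps):
    """Per-location statistics (mean, variance, skewness) over a stack of maps."""
    mean = maps.mean(axis=0)
    var = maps.var(axis=0)
    # Third standardized moment (skewness) per spatial location.
    skew = ((maps - mean) ** 3).mean(axis=0) / (var ** 1.5 + 1e-12)
    return mean, var, skew

ref_mean, ref_var, _ = spatial_stats(groups["reference"])
grp_mean, grp_var, _ = spatial_stats(groups["group_a"])

# Locations where the statistics differ most between a demographic group and
# the reference group are candidate facial regions for differing model behavior.
diff_map = np.abs(grp_mean - ref_mean) + np.abs(grp_var - ref_var)
hotspot = np.unravel_index(np.argmax(diff_map), diff_map.shape)
print("strongest-difference location:", hotspot)
```

In practice, each spatial location of `diff_map` would be mapped back onto the aligned face image, so peaks in the map correspond to facial regions such as the eyes, nose, or mouth.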

Item type: Conference publication
Published: 2022
Author(s): Fu, Biying ; Damer, Naser
Type of entry: Bibliography
Title: Towards Explaining Demographic Bias through the Eyes of Face Recognition Models
Language: English
Year of publication: 2022
Publisher: IEEE
Book title: 2022 IEEE International Joint Conference on Biometrics
Event title: International Joint Conference on Biometrics (IJCB)
Event location: Abu Dhabi, UAE
Event dates: 10.-13.10.2022
DOI: 10.1109/IJCB54206.2022.10007962
Uncontrolled keywords: Biometrics, Machine learning, Deep learning, Face recognition, Fairness
Division(s): 20 Department of Computer Science
20 Department of Computer Science > Graphisch-Interaktive Systeme
Date deposited: 06 Mar 2023 09:33
Last modified: 13 Jul 2023 13:50
PPN: 509617891