Schrom, Sebastian ; Hasler, Stephan (2019)
Domain Mixture: An Overlooked Scenario in Domain Adaptation.
18th International Conference on Machine Learning and Applications. Boca Raton, USA (16.12.2019-19.12.2019)
doi: 10.1109/ICMLA.2019.00013
Conference publication, Bibliography
Abstract
An image-based object classification system trained on one domain usually shows decreased performance on other domains if the data distributions differ significantly. Various domain adaptation approaches exist that improve generalization between domains. However, during transfer those approaches consider only the restricted setting in which supervised samples of all competing classes are available from the source domain. Here we investigate the more open and so far overlooked scenario in which, during training, only a subset of all competing classes is shown in one domain and another subset in another domain. We show the unexpected tendency of a deep learning classifier to use the domain origin as a prominent feature, which results in poor performance when testing on samples of unseen domain-class combinations. This issue can be overcome with an existing domain adaptation method, and additional unsupervised data for all unseen domain-class combinations is not essential. First results for this overlooked scenario are discussed extensively on a modified MNIST benchmark.
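The record does not specify how MNIST was modified, so the following Python sketch only illustrates what a domain-mixture training split of the kind described above could look like. The colour-inverted copy of MNIST used as "domain B" and the 0–4 / 5–9 class split are assumptions for illustration, not the authors' exact benchmark.

```python
# Minimal sketch of a domain-mixture split on MNIST (illustrative assumptions,
# not the authors' benchmark): domain A = plain MNIST, domain B = colour-inverted
# MNIST, and during training each class appears in only one of the two domains.
from torch.utils.data import ConcatDataset, Subset
from torchvision import datasets, transforms

to_tensor = transforms.ToTensor()                       # domain A: original images
invert = transforms.Compose([
    transforms.ToTensor(),
    transforms.Lambda(lambda x: 1.0 - x),               # domain B: inverted grey values
])

mnist_a = datasets.MNIST("data", train=True, download=True, transform=to_tensor)
mnist_b = datasets.MNIST("data", train=True, download=True, transform=invert)

def class_subset(ds, classes):
    """Subset of `ds` containing only samples whose label is in `classes`."""
    idx = [i for i, y in enumerate(ds.targets.tolist()) if y in classes]
    return Subset(ds, idx)

# Domain mixture: during training each class is seen in exactly one domain.
train_set = ConcatDataset([
    class_subset(mnist_a, set(range(0, 5))),    # classes 0-4 only from domain A
    class_subset(mnist_b, set(range(5, 10))),   # classes 5-9 only from domain B
])

# Test on the unseen domain-class combinations (A x {5-9}, B x {0-4}).
mnist_a_test = datasets.MNIST("data", train=False, download=True, transform=to_tensor)
mnist_b_test = datasets.MNIST("data", train=False, download=True, transform=invert)
test_unseen = ConcatDataset([
    class_subset(mnist_a_test, set(range(5, 10))),
    class_subset(mnist_b_test, set(range(0, 5))),
])
```

A classifier trained on `train_set` and evaluated on `test_unseen` is tested exactly on the domain-class combinations it never saw, which is the setting in which the abstract reports the performance drop.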
Type of entry: | Conference publication |
---|---|
Published: | 2019 |
Author(s): | Schrom, Sebastian ; Hasler, Stephan |
Entry type: | Bibliography |
Titel: | Domain Mixture: An Overlooked Scenario in Domain Adaptation |
Language: | English |
Publication date: | 20 December 2019 |
Publisher: | IEEE |
Book title: | Proceedings: 18th International Conference on Machine Learning and Applications - ICMLA 2019 |
Event title: | 18th International Conference on Machine Learning and Applications |
Event location: | Boca Raton, USA |
Event dates: | 16.12.2019-19.12.2019 |
DOI: | 10.1109/ICMLA.2019.00013 |
Department(s)/Field(s): | 18 Department of Electrical Engineering and Information Technology > Institute of Automatic Control and Mechatronics > Control Methods and Robotics (renamed Control Methods and Intelligent Systems as of 01.08.2022) |
Date deposited: | 22 Jan 2020 13:58 |
Last modified: | 03 Apr 2023 09:59 |