
A Choice Model with Infinitely Many Latent Features

Görür, D. ; Jäkel, F. ; Rasmussen, C. E. (2006)
A Choice Model with Infinitely Many Latent Features.
Proceedings of the 23rd International Conference on Machine Learning. Pittsburgh, PA
Conference publication, Bibliography

Abstract

Elimination by aspects (EBA) is a probabilistic choice model describing how humans decide between several options. The options from which the choice is made are characterized by binary features and associated weights. For instance, when choosing which mobile phone to buy, the features to consider may be a long-lasting battery, a color screen, etc. Existing methods for inferring the parameters of the model assume pre-specified features. However, the features that lead to the observed choices are not always known. Here, we present a non-parametric Bayesian model to infer the features of the options and the corresponding weights from choice data. We use the Indian buffet process (IBP) as a prior over the features. Inference using Markov chain Monte Carlo (MCMC) in conjugate IBP models has been described previously. The main contribution of this paper is an MCMC algorithm for the EBA model that can also be used for inference in other non-conjugate IBP models; this may broaden the use of IBP priors considerably.
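The pairwise case of the EBA rule summarized above can be sketched as follows: only aspects that one option has and the other lacks influence the choice, and the option is chosen with probability proportional to the total weight of its distinguishing aspects. This is a minimal illustration of the choice rule only, not the paper's MCMC inference algorithm; the feature values and weights below are hypothetical.

```python
def eba_pairwise_choice_prob(features_x, features_y, weights):
    """Probability of choosing option x over option y under the
    elimination-by-aspects rule (two-alternative case).

    features_x, features_y: binary feature vectors (0/1) of equal length.
    weights: non-negative weight for each feature.
    Shared aspects cancel; only distinguishing aspects matter.
    """
    # total weight of aspects unique to x, and unique to y
    u_x = sum(w for fx, fy, w in zip(features_x, features_y, weights)
              if fx and not fy)
    u_y = sum(w for fx, fy, w in zip(features_x, features_y, weights)
              if fy and not fx)
    if u_x + u_y == 0:
        return 0.5  # no distinguishing aspects: choose at random
    return u_x / (u_x + u_y)

# Hypothetical example: phone A has a long-lasting battery (weight 2.0),
# phone B has a color screen (weight 1.0); a third feature is shared by
# both and therefore does not affect the choice.
weights = [2.0, 1.0, 0.5]
phone_a = [1, 0, 1]
phone_b = [0, 1, 1]
p = eba_pairwise_choice_prob(phone_a, phone_b, weights)  # 2.0 / (2.0 + 1.0)
```

The paper's setting replaces the pre-specified feature vectors above with a latent binary feature matrix drawn from an IBP prior, and infers both the features and the weights from observed choices.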

Entry type: Conference publication
Published: 2006
Author(s): Görür, D. ; Jäkel, F. ; Rasmussen, C. E.
Type of record: Bibliography
Title: A Choice Model with Infinitely Many Latent Features
Language: English
Year of publication: 2006
Place: Pittsburgh, PA
Event title: Proceedings of the 23rd International Conference on Machine Learning
Event location: Pittsburgh, PA
URL / URN: https://dl.acm.org/citation.cfm?id=1143890

Department(s): 03 Department of Human Sciences
03 Department of Human Sciences > Institute of Psychology
03 Department of Human Sciences > Institute of Psychology > Models of Higher Cognition
Date deposited: 09 Jul 2018 09:06
Last modified: 12 Oct 2020 11:32