
A Choice Model with Infinitely Many Latent Features

Görür, D. and Jäkel, F. and Rasmussen, C. E. (2006):
A Choice Model with Infinitely Many Latent Features.
In: Proceedings of the 23rd International Conference on Machine Learning, Pittsburgh, PA. [Online edition: https://dl.acm.org/citation.cfm?id=1143890]
[Conference or Workshop Item]

Abstract

Elimination by aspects (EBA) is a probabilistic choice model describing how humans decide between several options. The options from which the choice is made are characterized by binary features and associated weights. For instance, when choosing which mobile phone to buy, the features to consider may be a long-lasting battery, a color screen, etc. Existing methods for inferring the parameters of the model assume pre-specified features. However, the features that lead to the observed choices are not always known. Here, we present a non-parametric Bayesian model to infer the features of the options and the corresponding weights from choice data. We use the Indian buffet process (IBP) as a prior over the features. Inference using Markov chain Monte Carlo (MCMC) in conjugate IBP models has been previously described. The main contribution of this paper is an MCMC algorithm for the EBA model that can also be used for inference in other non-conjugate IBP models; this may broaden the use of IBP priors considerably.
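Since the abstract compresses the model into a few sentences, a minimal sketch may help. The Python below is not code from the paper; alpha, the Gamma weight prior, and all numbers are assumptions made for illustration. It draws a binary feature matrix Z from the IBP's sequential "customers and dishes" scheme and evaluates the EBA choice probability for a pair of options, where only the aspects that one option has and the other lacks affect the choice.

    import numpy as np

    rng = np.random.default_rng(0)

    def sample_ibp(alpha, n_objects, rng):
        """Draw a binary feature matrix Z from the Indian buffet process.

        Object i takes each existing feature k with probability m_k / i,
        where m_k is the number of earlier objects that have feature k,
        and then adds Poisson(alpha / i) brand-new features of its own.
        """
        columns = []  # one 0/1 list per feature, over the objects seen so far
        for i in range(1, n_objects + 1):
            for col in columns:
                col.append(int(rng.random() < sum(col) / i))
            for _ in range(rng.poisson(alpha / i)):
                columns.append([0] * (i - 1) + [1])
        if not columns:
            return np.zeros((n_objects, 0), dtype=int)
        return np.array(columns, dtype=int).T  # rows: objects, columns: features

    def eba_pairwise(Z, w, i, j):
        """P(option i is chosen over option j) under elimination by aspects.

        Aspects shared by both options cancel; only the weights of the
        discriminating features enter the choice probability.
        """
        unique_i = (Z[i] == 1) & (Z[j] == 0)  # features i has but j lacks
        unique_j = (Z[j] == 1) & (Z[i] == 0)  # features j has but i lacks
        wi, wj = w[unique_i].sum(), w[unique_j].sum()
        return 0.5 if wi + wj == 0 else wi / (wi + wj)

    # Illustrative values only: alpha and the Gamma weight prior are assumptions.
    Z = sample_ibp(alpha=2.0, n_objects=3, rng=rng)
    w = rng.gamma(1.0, 1.0, size=Z.shape[1])  # one positive weight per feature
    print("Z =\n", Z)
    print("P(option 0 chosen over option 1) =", eba_pairwise(Z, w, 0, 1))

In the paper itself, both Z and the weights are latent and are inferred from observed choice data by MCMC; the sketch only shows the generative side of such a model.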

Item Type: Conference or Workshop Item
Published: 2006
Creators: Görür, D. and Jäkel, F. and Rasmussen, C. E.
Title: A Choice Model with Infinitely Many Latent Features
Language: English
Place of Publication: Pittsburgh, PA
Divisions: 03 Department of Human Sciences
03 Department of Human Sciences > Institute for Psychology
03 Department of Human Sciences > Institute for Psychology > Models of Higher Cognition
Event Title: Proceedings of the 23rd International Conference on Machine Learning
Event Location: Pittsburgh, PA
Date Deposited: 09 Jul 2018 09:06
Official URL: https://dl.acm.org/citation.cfm?id=1143890