
Pairwise Learning of Multilabel Classifications with Perceptrons

Loza Mencía, Eneldo and Fürnkranz, Johannes (2008):
Pairwise Learning of Multilabel Classifications with Perceptrons.
In: Proceedings of the 2008 IEEE International Joint Conference on Neural Networks (IJCNN-08), pp. 2900-2907, ISBN 978-1-4244-1821-3.
[Conference or Workshop Item]

Abstract

Multiclass multilabel perceptrons (MMP) have been proposed as an efficient incremental training algorithm for addressing a multilabel prediction task with a team of perceptrons. The key idea is to train one binary classifier per label, as is typically done for addressing multilabel problems, but to make the training signal dependent on the performance of the whole ensemble. In this paper, we propose an alternative technique that is based on a pairwise approach, i.e., we incrementally train a perceptron for each pair of classes. Our evaluation on four multilabel datasets shows that the multilabel pairwise perceptron (MLPP) algorithm yields substantial improvements over MMP in terms of ranking quality and overfitting resistance, while maintaining its efficiency. Despite the quadratic increase in the number of perceptrons that have to be trained, the increase in computational complexity is bounded by the average number of labels per training example.
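To make the pairwise idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of incremental training with one perceptron per label pair: for each incoming example, only the perceptrons whose pair consists of one relevant and one irrelevant label are updated, and a label ranking is obtained by voting. Class and method names are illustrative assumptions.

import numpy as np

class PairwisePerceptrons:
    """Sketch of a pairwise (one-perceptron-per-label-pair) multilabel learner."""

    def __init__(self, n_labels, n_features, lr=1.0):
        self.n_labels = n_labels
        self.lr = lr
        # one weight vector per unordered pair (i, j) with i < j;
        # a positive score votes for label i, a negative score for label j
        self.w = {(i, j): np.zeros(n_features)
                  for i in range(n_labels) for j in range(i + 1, n_labels)}

    def partial_fit(self, x, relevant):
        # incremental update: only perceptrons whose pair consists of one
        # relevant and one irrelevant label see this example, so each example
        # touches |relevant| * |irrelevant| of the K*(K-1)/2 perceptrons
        relevant = set(relevant)
        irrelevant = set(range(self.n_labels)) - relevant
        for i in relevant:
            for j in irrelevant:
                key, target = ((i, j), 1.0) if i < j else ((j, i), -1.0)
                w = self.w[key]
                if target * (w @ x) <= 0:      # pair misordered -> perceptron update
                    w += self.lr * target * x

    def rank(self, x):
        # aggregate the pairwise predictions by voting to obtain a label ranking
        votes = np.zeros(self.n_labels)
        for (i, j), w in self.w.items():
            if w @ x > 0:
                votes[i] += 1
            else:
                votes[j] += 1
        return np.argsort(-votes)              # labels sorted by descending votes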

Item Type: Conference or Workshop Item
Published: 2008
Creators: Loza Mencía, Eneldo and Fürnkranz, Johannes
Title: Pairwise Learning of Multilabel Classifications with Perceptrons
Language: English
Title of Book: Proceedings of the 2008 IEEE International Joint Conference on Neural Networks (IJCNN-08)
ISBN: 978-1-4244-1821-3
Divisions: 20 Department of Computer Science
20 Department of Computer Science > Knowledge Engineering
Date Deposited: 24 Jun 2011 15:07
Identification Number: doi:10.1109/IJCNN.2008.4634206