Article Catalogue (Journal Indexes)
OPAC HELP
Title: Oriented principal component analysis for large margin classifiers
Author: Bermejo, S.; Cabestany, J.
 Addresses:
 Univ Politecn Catalunya, Dept Elect Engn, ES-08034 Barcelona, Spain
 Journal Title:
 NEURAL NETWORKS
Issue: 10,
Volume: 14,
Year: 2001,
Pages: 1447-1461
 SICI:
 0893-6080(200112)14:10<1447:OPCAFL>2.0.ZU;2-A
 Source:
 ISI
 Language:
 ENG
 Subject:
 VECTOR QUANTIZATION; IMAGE COMPRESSION; CLASSIFICATION; PROJECTION;
 Keywords:
 large margin classifiers; oriented principal component analysis; cooperative learning; principal component neural networks; learning-to-learn algorithms; feature extraction; online gradient descent; pattern recognition;
 Document Type:
 Article
 Nature:
 Periodical
 Disciplinary Field:
 Engineering, Computing & Technology
 Citations:
 44
 Review:
 Address for reprints:
 Address: Bermejo, S., Univ Politecn Catalunya, Dept Elect Engn, Jordi Girona 1-3, C4 Bldg, ES-08034 Barcelona, Spain



 Citation:
 S. Bermejo and J. Cabestany, "Oriented principal component analysis for large margin classifiers", NEURAL NETW, 14(10), 2001, pp. 1447-1461
Abstract
Large margin classifiers (such as MLPs) are designed to assign training samples with high confidence (or margin) to one of the classes. Recent theoretical results of these systems show why the use of regularisation terms and feature extractor techniques can enhance their generalisation properties. Since the optimal subset of features selected depends on the classification problem, but also on the particular classifier with which they are used, global learning algorithms for large margin classifiers that use feature extractor techniques are desired. A direct approach is to optimise a cost function based on the margin error, which also incorporates regularisation terms for controlling capacity. These terms must penalise a classifier with the largest margin for the problem at hand. Our work shows that the inclusion of a PCA term can be employed for this purpose. Since PCA only achieves an optimal discriminatory projection for some particular distribution of data, the margin of the classifier can then be effectively controlled. We also propose a simple constrained search for the global algorithm in which the feature extractor and the classifier are trained separately. This allows a degree of flexibility for including heuristics that can enhance the search and the performance of the computed solution. Experimental results demonstrate the potential of the proposed method. (C) 2001 Elsevier Science Ltd. All rights reserved.
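The cost structure the abstract describes (a margin error term plus a PCA regularisation term evaluated on the extracted features) can be illustrated with a minimal sketch. This is not the authors' implementation: the hinge-style margin loss, the linear classifier, the projection matrix `W`, and the weighting `lam` are all illustrative assumptions; the paper's actual cost and training procedure are given in the article itself.

```python
import numpy as np

def pca_reconstruction_error(X, W):
    """Mean squared reconstruction error of X under projection W.

    W is a (d, k) matrix with orthonormal columns; a small error means
    the subspace spanned by W captures the data well (the PCA term).
    """
    X_rec = X @ W @ W.T
    return np.mean(np.sum((X - X_rec) ** 2, axis=1))

def margin_loss(X, y, W, w, b):
    """Hinge-style margin error of a linear classifier on features X @ W.

    y holds labels in {-1, +1}; samples classified with margin >= 1
    contribute zero loss (assumed stand-in for the paper's margin error).
    """
    scores = (X @ W) @ w + b
    return np.mean(np.maximum(0.0, 1.0 - y * scores))

def total_cost(X, y, W, w, b, lam=0.1):
    """Margin error plus lam-weighted PCA regularisation term."""
    return margin_loss(X, y, W, w, b) + lam * pca_reconstruction_error(X, W)
```

A constrained search in the spirit of the abstract would alternate between updating `W` (the feature extractor) against the PCA term and updating `(w, b)` (the classifier) against the margin term, rather than descending on `total_cost` jointly.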
ASDD Departmental and Documentary Systems Area, Università di Bologna, Catalogue of journals and other periodicals
Document generated on 04/04/20 at 02:40:23