Article Catalog (Journal Indexes)

OPAC HELP

Title:
Nonlinear kernel-based statistical pattern analysis
Author:
Ruiz, A; Lopez-de-Teruel, PE;
Addresses:
Univ Murcia, Dept Comp Sci, E-30001 Murcia, Spain; Univ Murcia, Engn & Technol Dept, E-30001 Murcia, Spain
Journal Title:
IEEE TRANSACTIONS ON NEURAL NETWORKS
issue: 1, volume: 12, year: 2001,
pages: 16-32
SICI:
1045-9227(200101)12:1<16:NKSPA>2.0.ZU;2-#
Source:
ISI
Language:
ENG
Subject:
SUPPORT VECTOR MACHINES; FUNCTION APPROXIMATION; DENSITY-ESTIMATION; CROSS-VALIDATION; RECOGNITION; REGRESSION; NETWORKS;
Keywords:
Fisher's discriminant analysis; kernel expansion; Mahalanobis distance; minimum squared error (MSE) estimation; nonlinear feature extraction; nonparametric statistics; pseudoinverse; support vector machine (SVM);
Document Type:
Article
Nature:
Periodical
Disciplinary Field:
Engineering, Computing & Technology
Citations:
54
Review:
Address for reprints:
Address: Ruiz, A, Univ Murcia, Dept Comp Sci, E-30001 Murcia, Spain
Citation:
A. Ruiz and P.E. Lopez-de-Teruel, "Nonlinear kernel-based statistical pattern analysis", IEEE NEURAL, 12(1), 2001, pp. 16-32

Abstract

The eigenstructure of the second-order statistics of a multivariate random population can be inferred from the matrix of pairwise inner products of the samples. Therefore, it can also be efficiently obtained in the implicit, high-dimensional feature spaces defined by kernel functions. We elaborate on this property to obtain general expressions for immediate derivation of nonlinear counterparts of a number of standard pattern analysis algorithms, including principal component analysis, data compression and denoising, and Fisher's discriminant. The connection between kernel methods and nonparametric density estimation is also illustrated. Using these results we introduce the kernel version of the Mahalanobis distance, which originates nonparametric models with unexpected and interesting properties, and also propose a kernel version of the minimum squared error (MSE) linear discriminant function. This learning machine is particularly simple and includes a number of generalized linear models such as the potential functions method or the radial basis function (RBF) network. Our results shed some light on the relative merit of feature spaces and inductive bias in the remarkable generalization properties of the support vector machine (SVM). Although in most situations the SVM obtains the lowest error rates, exhaustive experiments with synthetic and natural data show that simple kernel machines based on pseudoinversion are competitive in problems with appreciable class overlapping.
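The abstract's central observation, that the eigenstructure of the feature-space second-order statistics can be recovered from the matrix of pairwise inner products (the Gram matrix), is what makes nonlinear counterparts such as kernel principal component analysis possible. A minimal sketch of that idea follows; the RBF kernel choice, function names, and parameters are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    """Project samples onto the leading principal axes of the implicit
    feature space, using only pairwise inner products (the kernel trick)."""
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    # Center the samples in feature space: K_c = (I - 1/n) K (I - 1/n)
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecomposition of the centered Gram matrix (ascending eigenvalues)
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    # Normalize so the feature-space principal axes have unit length
    alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
    # Projections of the training samples onto the kernel principal axes
    return Kc @ alphas

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
Z = kernel_pca(X, n_components=2, gamma=0.5)
print(Z.shape)  # (50, 2)
```

Note that the eigendecomposition is carried out on the n x n Gram matrix rather than on the (possibly infinite-dimensional) feature-space covariance, which is precisely the efficiency the abstract refers to.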

ASDD Departmental and Documentary Systems Area, Università di Bologna, Catalog of journals and other periodicals
Document generated on 29/03/20 at 14:29:54