Article Catalogue (Journal Article Index)


Title:
A unifying information-theoretic framework for independent component analysis
Authors:
Lee, TW; Girolami, M; Bell, AJ; Sejnowski, TJ;
Addresses:
Salk Inst, Howard Hughes Med Inst, Computat Neurobiol Lab, La Jolla, CA 92037 USA; Tech Univ Berlin, Inst Elect, D-10587 Berlin, Germany; Univ Paisley, Dept Comp & Informat Sci, Paisley PA1 2BE, Renfrew, Scotland; Univ Calif San Diego, Dept Biol, La Jolla, CA 92093 USA
Journal Title:
COMPUTERS & MATHEMATICS WITH APPLICATIONS
issue: 11, volume: 39, year: 2000,
pages: 1-21
SICI:
0898-1221(200006)39:11<1:AUIFFI>2.0.ZU;2-R
Source:
ISI
Language:
ENG
Subject:
BLIND SIGNAL SEPARATION; LEARNING ALGORITHMS; PROJECTION PURSUIT; MUTUAL INFORMATION; MIXTURE; REPRESENTATION; DISTRIBUTIONS; MAXIMIZATION; NETWORK; RULE;
Keywords:
blind source separation; ICA; entropy; information maximization; maximum likelihood estimation;
Document Type:
Article
Nature:
Periodical
Disciplinary Field:
Engineering, Computing & Technology
Citations:
85
Review:
Reprint Addresses:
Address: Lee, TW, Salk Inst, Howard Hughes Med Inst, Computat Neurobiol Lab, La Jolla, CA 92037 USA
Citation:
T.W. Lee et al., "A unifying information-theoretic framework for independent component analysis", COMPUT MATH, 39(11), 2000, pp. 1-21

Abstract

We show that different theories recently proposed for independent component analysis (ICA) lead to the same iterative learning algorithm for blind separation of mixed independent sources. We review those theories and suggest that information theory can be used to unify several lines of research. Pearlmutter and Parra [1] and Cardoso [2] showed that the infomax approach of Bell and Sejnowski [3] and the maximum likelihood estimation approach are equivalent. We show that negentropy maximization also has equivalent properties, and therefore, all three approaches yield the same learning rule for a fixed nonlinearity. Girolami and Fyfe [4] have shown that the nonlinear principal component analysis (PCA) algorithm of Karhunen and Joutsensalo [5] and Oja [6] can also be viewed from information-theoretic principles since it minimizes the sum of squares of the fourth-order marginal cumulants, and therefore, approximately minimizes the mutual information [7]. Lambert [8] has proposed different Bussgang cost functions for multichannel blind deconvolution. We show how the Bussgang property relates to the infomax principle. Finally, we discuss convergence and stability as well as future research issues in blind source separation. (C) 2000 Elsevier Science Ltd. All rights reserved.
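To make the shared learning rule concrete, the Python sketch below implements the natural-gradient form of the infomax/maximum-likelihood update that the abstract refers to, dW = (I - phi(u) u^T) W with u = W x. The function name, the tanh score nonlinearity, the step size, the iteration count, and the toy Laplacian sources are illustrative assumptions for this sketch, not details taken from the paper.

import numpy as np

def ica_natural_gradient(X, n_iter=2000, lr=0.01, seed=0):
    # Minimal sketch: unmix X (n_sources x n_samples) with the
    # natural-gradient rule dW = (I - tanh(U) U^T / n_samples) W,
    # where U = W X. tanh is an assumed score function suited to
    # super-Gaussian sources; other choices change the fixed points.
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    W = np.eye(n) + 0.1 * rng.standard_normal((n, n))
    for _ in range(n_iter):
        U = W @ X                        # current source estimates
        G = np.tanh(U)                   # score nonlinearity
        W += lr * (np.eye(n) - G @ U.T / X.shape[1]) @ W
    return W

# Toy usage: two Laplacian (super-Gaussian) sources, fixed mixing matrix.
rng = np.random.default_rng(1)
S = rng.laplace(size=(2, 5000))
A = np.array([[1.0, 0.6], [0.4, 1.0]])
W = ica_natural_gradient(A @ S)
print(np.round(W @ A, 2))  # approx. a scaled permutation of the identity

With tanh as the score function the update is stable for super-Gaussian sources such as speech; sub-Gaussian sources require a different nonlinearity, which is part of what the unified framework discussed in the paper addresses.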
