Article Catalog (Journal Article Records)


Title:
Predictability, complexity, and learning
Author(s):
Bialek, W.; Nemenman, I.; Tishby, N.
Addresses:
NEC Res Inst, Princeton, NJ 08540 USA
Princeton Univ, Dept Phys, Princeton, NJ 08544 USA
Hebrew Univ Jerusalem, Sch Comp Sci & Engn, IL-91904 Jerusalem, Israel
Hebrew Univ Jerusalem, Ctr Neural Computat, IL-91904 Jerusalem, Israel
Journal Title:
NEURAL COMPUTATION
issue: 11, volume: 13, year: 2001,
pages: 2409-2463
SICI:
0899-7667(200111)13:11<2409:PCAL>2.0.ZU;2-1
Source:
ISI
Language:
ENG
Subject:
CONTINUOUS PROBABILITY-DISTRIBUTIONS; STOCHASTIC COMPLEXITY; STATISTICAL-MECHANICS; DENSITY-ESTIMATION; THERMODYNAMIC DEPTH; MUTUAL INFORMATION; FISHER INFORMATION; SPIKE TRAINS; ENTROPY; BOUNDS;
Document type:
Review
Nature:
Periodical
Subject area:
Life Sciences
Engineering, Computing & Technology
Citations:
107
Review:
Reprint address:
Bialek, W., NEC Res Inst, 4 Independence Way, Princeton, NJ 08540 USA
Citation:
W. Bialek et al., "Predictability, complexity, and learning", NEURAL COMP, 13(11), 2001, pp. 2409-2463

Abstract

We define predictive information I_pred(T) as the mutual information between the past and the future of a time series. Three qualitatively different behaviors are found in the limit of large observation times T: I_pred(T) can remain finite, grow logarithmically, or grow as a fractional power law. If the time series allows us to learn a model with a finite number of parameters, then I_pred(T) grows logarithmically with a coefficient that counts the dimensionality of the model space. In contrast, power-law growth is associated, for example, with the learning of infinite parameter (or nonparametric) models such as continuous functions with smoothness constraints. There are connections between the predictive information and measures of complexity that have been defined both in learning theory and the analysis of physical systems through statistical mechanics and dynamical systems theory. Furthermore, in the same way that entropy provides the unique measure of available information consistent with some simple and plausible conditions, we argue that the divergent part of I_pred(T) provides the unique measure for the complexity of dynamics underlying a time series. Finally, we discuss how these ideas may be useful in problems in physics, statistics, and biology.
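For reference, the definition in the first sentence can be written out explicitly. This is the standard mutual-information identity, not a formula quoted from the article; the symbols x_past (the observed segment of duration T) and x_future (the subsequent data) are introduced here only for notation:

I_{\mathrm{pred}}(T) \;=\; \left\langle \log_2 \frac{P(x_{\mathrm{future}} \mid x_{\mathrm{past}})}{P(x_{\mathrm{future}})} \right\rangle \;=\; S(\mathrm{past}) + S(\mathrm{future}) - S(\mathrm{past},\mathrm{future}),

where S(\cdot) denotes the Shannon entropy of the corresponding segment. The limiting behaviors of I_pred(T) discussed in the abstract (finite, logarithmic, or power-law growth) refer to this quantity as the past window T becomes large.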

ASDD Area Sistemi Dipartimentali e Documentali, Università di Bologna, catalog of journals and other periodicals