Preprints, Working Papers, ... Year: 2009

A probabilistic study of neural complexity

Abstract

G. Edelman, O. Sporns, and G. Tononi have introduced the neural complexity of a family of random variables, defining it as a specific average of mutual information over subfamilies. We show that their choice of weights satisfies two natural properties, namely exchangeability and additivity, and we call any functional satisfying these two properties an intricacy. We classify all intricacies in terms of probability laws on the unit interval and study the growth rate of maximal intricacies when the size of the system goes to infinity. For systems of a fixed size, we show that maximizers have small support and exchangeable systems have small intricacy. In particular, maximizing intricacy leads to spontaneous symmetry breaking and failure of uniqueness.
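As a rough illustration of the quantity described above, the sketch below computes a weighted average of the mutual informations I(X_S; X_{S^c}) over all subfamilies S of a small discrete system. The uniform-over-sizes weights c(S) = 1/((N+1)·binom(N,|S|)) are one exchangeable and additive choice made here for illustration, not a formula quoted from the preprint, and the function names are mine.

```python
# A minimal numerical sketch of an "average of mutual information over
# subfamilies" for a small discrete system. The uniform-over-sizes weights
# used in weighted_intricacy() are one exchangeable, additive choice made
# here for illustration; they are not quoted from the preprint.
import itertools
import math
import numpy as np

def entropy(p):
    """Shannon entropy (in nats) of a probability array, ignoring zeros."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def mutual_information(joint, subset):
    """I(X_S ; X_{S^c}) = H(X_S) + H(X_{S^c}) - H(X) for a joint pmf
    stored as an N-dimensional array (one axis per variable)."""
    n = joint.ndim
    comp = tuple(i for i in range(n) if i not in subset)
    h_s = entropy(joint.sum(axis=comp))            # marginal on S
    h_c = entropy(joint.sum(axis=tuple(subset)))   # marginal on the complement
    return h_s + h_c - entropy(joint)

def weighted_intricacy(joint):
    """Average of I(X_S ; X_{S^c}) over all subsets S, weighted uniformly
    over subset sizes and uniformly within each size."""
    n = joint.ndim
    total = 0.0
    for k in range(n + 1):
        for subset in itertools.combinations(range(n), k):
            total += mutual_information(joint, subset) / ((n + 1) * math.comb(n, k))
    return total

# Example: three fair bits where the third is the XOR of the first two.
joint = np.zeros((2, 2, 2))
for a, b in itertools.product((0, 1), repeat=2):
    joint[a, b, a ^ b] = 0.25
print(weighted_intricacy(joint))  # log(2)/2 ≈ 0.3466 nats
```

In the XOR example every proper, nonempty subfamily shares exactly one bit of information with its complement, so the weighted average comes out to log(2)/2 nats.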
Main file: intricacy1-revised-arxiv-2009-18-12.pdf (346.51 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-00409143, version 1 (06-08-2009)
hal-00409143, version 2 (13-09-2009)
hal-00409143, version 3 (18-12-2009)

Identifiers

HAL Id: hal-00409143

Cite

Jerome Buzzi, Lorenzo Zambotti. A probabilistic study of neural complexity. 2009. ⟨hal-00409143v3⟩
