Evolutionary supervision of a dynamical neural network allows learning with on-going weights - Institut des Sciences Cognitives
Conference paper Year: 2005


David Meunier
Hélène Paugam-Moisy

Abstract

Recent electrophysiological data show that synaptic weights are strongly influenced by the electrical activity of neurons: weights are not stable, as classical neural network models assume. What, then, is the nature of engrams, if they are not stored in synaptic weights? Adopting the theory of dynamical systems, which allows an implicit form of memory, we propose a new framework for learning in which synaptic weights are continuously adapted. Evolutionary computation is applied to a population of dynamical neural networks placed in a prey-predator environment. Each individual develops complex dynamic patterns of neuronal activity, underpinned by multiple recurrent connections. We show that this method allows a learning capability to emerge across generations, as a by-product of evolution, since the behavioural performance on which selection is based does not presuppose this property.
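The setup described in the abstract — a population of recurrent networks whose weights keep changing throughout each individual's lifetime, with selection acting only on behavioural performance — can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the paper's implementation: the rate-based network model, the Hebbian rule standing in for on-going weight adaptation, and the toy fitness proxy (replacing the prey-predator task) are all assumptions introduced here for clarity.

```python
import math
import random

random.seed(0)

N = 6     # neurons per recurrent network (illustrative size)
GENS = 20 # generations of evolutionary supervision
POP = 12  # population size

def new_genome():
    # Genome encodes initial weights plus a plasticity rate,
    # not the final weights: those keep changing during life.
    return {"w": [[random.uniform(-1, 1) for _ in range(N)] for _ in range(N)],
            "eta": random.uniform(0.0, 0.1)}

def lifetime_fitness(g, steps=50):
    # Run the recurrent net; weights are never frozen ("on-going weights").
    w = [row[:] for row in g["w"]]
    x = [random.uniform(0, 1) for _ in range(N)]
    score = 0.0
    for _ in range(steps):
        # Simple rate dynamics with a tanh nonlinearity.
        y = [math.tanh(sum(w[i][j] * x[j] for j in range(N))) for i in range(N)]
        # Hebbian update applied continuously during the lifetime.
        for i in range(N):
            for j in range(N):
                w[i][j] += g["eta"] * y[i] * x[j]
        x = y
        # Toy behavioural proxy: reward sustained, balanced activity.
        score += 1.0 - abs(sum(x) / N)
    return score / steps  # in [0, 1]

def mutate(g):
    child = {"w": [row[:] for row in g["w"]], "eta": g["eta"]}
    i, j = random.randrange(N), random.randrange(N)
    child["w"][i][j] += random.gauss(0, 0.2)
    child["eta"] = max(0.0, child["eta"] + random.gauss(0, 0.01))
    return child

# Selection is on behaviour only; nothing selects for "learning" directly,
# so any learning capability that appears is a by-product of evolution.
pop = [new_genome() for _ in range(POP)]
for _ in range(GENS):
    scored = sorted(pop, key=lifetime_fitness, reverse=True)
    elite = scored[:POP // 2]
    pop = elite + [mutate(random.choice(elite)) for _ in range(POP - len(elite))]

best = max(lifetime_fitness(g) for g in pop)
print(round(best, 3))
```

Note the design choice mirroring the abstract: the fitness function never mentions plasticity, so the evolved value of `eta` (how strongly weights adapt in-life) is shaped only indirectly, through its effect on behaviour.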
Main file

MeunierPaugamMoisyIJCNN05.pdf (526.99 KB)

Dates and versions

hal-00008702, version 1 (13-09-2005)

Identifiers

  • HAL Id: hal-00008702, version 1

Cite

David Meunier, Hélène Paugam-Moisy. Evolutionary supervision of a dynamical neural network allows learning with on-going weights. 2005, pp. 1493-1498. ⟨hal-00008702⟩
