Title: Visual Information: Propagation, Adaptation, Population Coding
Speaker: Rossum, Mark van
Group/Series/Folder: Record Group 8.15 - Institute for Advanced Study, Series 3 - Audio-visual Materials
Notes: StatPhysHK Conference. Talk no. 15
Title from title slide.
IAS Program on Statistical Physics and Computational Neuroscience, held 2-19 July, 2013, at Hong Kong University of Science and Technology. Sponsors: Hong Kong Baptist University, Croucher Foundation, and K. C. Wong Education Foundation.
StatPhysHK Conference, a satellite of STATPHYS 25, held 17-19 July, 2013, at Hong Kong University of Science and Technology and Hong Kong Baptist University.
Abstract: Synapses in the brain are presumably continuously subject to increases and decreases as a result of ongoing learning. This raises the danger of runaway plasticity, i.e. connections growing without bound. Using computational models, we demonstrate that, rather counterintuitively, the observed increase in synaptic spine size that accompanies LTP constrains synaptic plasticity, leading to 'soft-bound' plasticity. Next, we show theoretically how soft-bound plasticity maximizes the storage capacity of synapses, measured using Shannon information, in a pattern-recognition task. Homeostasis has often been introduced alongside synaptic plasticity models to ensure the stability of neuronal networks and to induce competition between synapses. We introduce a framework for linear homeostatic controllers and show that stability of a single neuron does not guarantee stability of a network of neurons. In particular, we find that slow oscillations can develop that defeat the purpose of homeostasis. Moreover, we find that adding more filters in the feedback loop can also have destabilizing effects at the network level. These results constrain both biological and engineered homeostatic controllers.
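The 'soft-bound' plasticity mentioned in the abstract can be illustrated with a minimal sketch. This is not the talk's model; it assumes the common multiplicative form in which the LTP increment shrinks as the weight approaches its upper bound (and the LTD decrement shrinks as the weight approaches zero), contrasted with a hard-bound rule that takes fixed-size steps and clips at the bounds. All parameter names and values here are illustrative assumptions.

```python
# Illustrative sketch of soft-bound vs. hard-bound synaptic plasticity.
# The multiplicative soft-bound form and all parameters are assumptions,
# not taken from the talk.

def update_soft(w, dw_plus, dw_minus, potentiate, w_max=1.0):
    """Soft bounds: LTP steps scale with (w_max - w), LTD steps with w,
    so updates shrink as the weight nears either bound."""
    if potentiate:
        return w + dw_plus * (w_max - w)
    return w - dw_minus * w

def update_hard(w, dw_plus, dw_minus, potentiate, w_max=1.0):
    """Hard bounds: fixed-size steps, clipped at [0, w_max]."""
    if potentiate:
        return min(w + dw_plus, w_max)
    return max(w - dw_minus, 0.0)

# Under repeated potentiation, the hard-bound weight saturates abruptly
# at w_max, while the soft-bound weight only approaches it asymptotically,
# leaving room for further (graded) modification.
w_soft, w_hard = 0.5, 0.5
for _ in range(20):
    w_soft = update_soft(w_soft, 0.1, 0.1, potentiate=True)
    w_hard = update_hard(w_hard, 0.1, 0.1, potentiate=True)
```

After 20 potentiation events the hard-bound weight sits pinned at `w_max`, whereas the soft-bound weight remains strictly below it; this graded saturation is one intuition for why soft bounds can preserve storage capacity.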
Duration: 43 min.
Appears in Series: 8.15:3 - Audio-visual Materials
Videos for Public -- Distinguished Lectures