Next Events

Computational Neuroscience Seminars

 

Wednesday, October 10
Location: SV 1717

Time: 12:15

 

 

BMI Seminar: Sandro Romani

Janelia Research Campus

 

Title:

Learning and memory in hippocampus and cortex, a tale of two theories

Abstract
Learning is thought to be mediated by changes in synaptic weights. Many physiology-based theories posit that synaptic modifications are induced by repeated, near-coincident spiking of pre- and post-synaptic neurons. At the level of behavior, however, learning can occur without repetition and can link events separated in time by seconds. The hippocampus has been implicated in these forms of learning. Analysis and modeling of in-vivo recordings from region CA1 of the rodent hippocampus, validated with in-vitro manipulations, reveal a novel learning rule: pre-synaptic spiking and post-synaptic complex spiking can be separated by seconds while still inducing potent (one-shot) changes in synaptic weights. This novel plasticity rule offers an immediate connection between behavior and synaptic changes in the hippocampus.
The cortex, on the other hand, is thought to learn by slowly shaping circuit dynamics to subserve complex behaviors. For instance, certain behaviors require short-term memory: the ability to maintain information, in the absence of cues, over a time scale of seconds. The classic neural correlate of short-term memory is persistent selective activity, elevated neuronal firing that persists during memory maintenance, shows large and systematic ramping, and depends on the particular item kept in memory. Several hypotheses have been proposed to explain how neural circuits can support short-term memory. We use theory-driven optogenetic perturbations of pre-motor cortex in rodents performing a delayed binary-response task. Circuit dynamics following recovery from perturbations reveal the presence of discrete attractors. We further devise a task structure that eliminates ramping activity, yielding stationary activity patterns as observed in standard attractor network models.


Wednesday, October 31
Location: SV 1717

Time: 10:30

Danilo Jimenez Rezende

Google Deepmind

 

Title:

Probabilistic Deep Learning:
Foundations, applications and open problems.

Abstract
Advances in deep generative models are at the forefront of deep learning research because of the promise they offer for data-efficient learning and for model-based reinforcement learning. In this talk I will review the foundations of probabilistic reasoning and generative modeling. I will then introduce modern approximations that allow efficient large-scale training of a wide variety of generative models, and demonstrate a few applications of these models to density estimation, missing-data imputation, data compression, and planning. Finally, I will discuss some of the open problems in the field.


Thursday, December 6th: Swiss Computational Neuroscience Seminars
Location: University of Zurich, Irchel Campus
Time: TBA

Bence Ölveczky

Harvard University

Title:

TBA

Abstract

TBA

Ilya Nemenman

Emory University

Title:

TBA

Abstract

TBA

Click here for a complete list of all future and past Swiss Computational Neuroscience seminars