Master Thesis Projects

Master Thesis Projects begin once the rest of the Master's program is completed and all required credits have been obtained.
Projects for SSC and SIN students should last 4 months at EPFL, or 6 months in industry or at another university.
Master Thesis Projects must be done individually.
Master Thesis Projects are worth 30 credits.
Students must obtain the approval of the professor in charge of the laboratory before registering for a given project.

Link to the Academic Calendar

List of Projects – Fall 2018

Computational Neuroscience of Information Transmission

Spiking neuron models are highly nonlinear, yet pairwise correlations carry a large amount of information, as shown in a recent paper in Nature Communications. The aim of this project is to reproduce the analysis of the paper for a particular nonlinear neuron model.
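To give a rough feel for the topic, the toy sketch below shows how shared input induces pairwise correlations between two binary spike trains. This is illustrative only: it is not the neuron model or the analysis of the paper, and the parameters (shared-input fraction `c`, firing rate, trial length) are made up.

```python
# Illustrative toy model (not the paper's model): two binary spike trains
# receive a shared drive with fraction c; their pairwise (Pearson)
# correlation grows with c. All parameter values are arbitrary.
import math
import random

def simulate_pair(c, n_steps=20000, rate=0.2, seed=1):
    """Two binary spike trains sharing a fraction c of their input drive."""
    rng = random.Random(seed)
    s1, s2 = [], []
    for _ in range(n_steps):
        shared = rng.random() < c * rate          # common input event
        s1.append(1 if shared or rng.random() < (1 - c) * rate else 0)
        s2.append(1 if shared or rng.random() < (1 - c) * rate else 0)
    return s1, s2

def pearson(a, b):
    """Pairwise (Pearson) correlation coefficient of two spike trains."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n
    va = sum((x - ma) ** 2 for x in a) / n
    vb = sum((y - mb) ** 2 for y in b) / n
    return cov / math.sqrt(va * vb)

# Shared drive induces pairwise correlation; independent drive does not.
r_shared = pearson(*simulate_pair(c=0.5))
r_indep = pearson(*simulate_pair(c=0.0))
```

In the actual project, the binary trains would be replaced by spikes generated from the nonlinear neuron model under study.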

Profile of candidate: EPFL master's student in physics or computational science who has taken the class Biological Modeling of Neural Networks. Please send your CV and an up-to-date grade sheet to Tilo Schwalger.

List of Projects – Spring 2018

Characterization of simulated plasticity-induction protocols

In the experimental literature, a large number of different plasticity-induction protocols can be found. Building a single modeling framework that incorporates all of these experimental techniques is challenging and often requires arbitrary modeling choices. The aim of this semester project is to identify the plasticity effects that are model-independent across protocols. To achieve this, the student will first become familiar with the experimental literature and then use existing implementations of neural and plasticity models (or, possibly, implement new ones). Finally, the student will analyze the simulation results in order to extract invariant features.

The preferred programming language is Python. The project is open to all students.
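As a flavor of the kind of simulation involved, the sketch below applies a standard pair-based STDP rule to a classic pairing protocol (pre- and post-synaptic spikes repeated at a fixed time offset). The rule and all amplitudes and time constants are illustrative choices, not the specific models the project will use.

```python
# Minimal pair-based STDP sketch; amplitudes and time constants are
# illustrative, not taken from any particular experimental protocol.
import math

A_PLUS, A_MINUS = 0.01, 0.012      # potentiation / depression amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # time constants in ms

def stdp_update(dt_ms):
    """Weight change for a single pre/post spike pair.
    dt_ms = t_post - t_pre; positive dt gives potentiation."""
    if dt_ms >= 0:
        return A_PLUS * math.exp(-dt_ms / TAU_PLUS)
    return -A_MINUS * math.exp(dt_ms / TAU_MINUS)

def pairing_protocol(dt_ms, n_pairs=60):
    """Total weight change after repeating the same spike pair n_pairs times."""
    return n_pairs * stdp_update(dt_ms)

# Pre-before-post (dt = +10 ms) potentiates; post-before-pre depresses.
dw_ltp = pairing_protocol(+10.0)
dw_ltd = pairing_protocol(-10.0)
```

Comparing such simulated weight changes across different protocols (pairing frequency, spike triplets, voltage steps) is one way to probe which effects are model-independent.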

Interested students should send grades and CV to Chiara Gastaldi or Samuel Muscinelli.

Adaptive learning through surprise minimisation over extended time

Whenever there is a mismatch between our expectations and our actual experience, we are surprised and often need to update our beliefs about the world. In a framework recently developed in the LCN, we mathematically defined a novel measure of surprise and proposed a surprise-minimisation learning algorithm. The goal of this semester project is to extend this framework to account for: 1) the contribution of the history of data points to learning, and 2) the adjustment of the subjective propensity to be surprised in a principled way.

The project requires a substantial background in mathematics and good programming skills in Python or Matlab.
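The general idea of surprise-modulated learning can be sketched generically. The example below uses Shannon surprise, -log p(x), of a Bernoulli estimate to scale the learning rate, so that unexpected observations trigger faster belief revision. This is not the LCN-specific surprise measure or algorithm; the mapping from surprise to learning rate and all parameters are illustrative.

```python
# Generic illustration (not the LCN measure): Shannon surprise of a
# Bernoulli estimate modulates the learning rate of a running belief.
import math

def shannon_surprise(x, p):
    """Shannon surprise -log p(x) of observing x in {0, 1} under estimate p."""
    p_x = p if x == 1 else 1.0 - p
    return -math.log(max(p_x, 1e-12))

def surprise_modulated_estimate(data, base_rate=0.05, p0=0.5):
    """Running Bernoulli estimate whose update step grows with surprise."""
    p = p0
    for x in data:
        s = shannon_surprise(x, p)
        eta = min(1.0, base_rate * (1.0 + s))  # surprise-scaled learning rate
        p += eta * (x - p)
    return p
```

The project's extensions would replace this memoryless update with one that weighs the history of data points and tunes the propensity to be surprised in a principled way.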

Interested students should send grades and CV to Dane Corneil and Vasiliki Liakoni