Gonzalo Cogno Group at the Kavli Institute for Systems Neuroscience
Kavli Institute's Neural Dynamics and Computation Group
About
Brain function emerges from the dynamic coordination of interconnected neurons. It remains unclear, however, how different types of neurons interact and how, together, they underlie cognition and behaviour. Addressing these questions requires a synergistic approach in which computational and experimental neuroscience work hand in hand.
For more information about our lab, please visit our external website: www.gonzalocognolab.com
Aim
Our group seeks to understand the mechanisms and dynamics underlying network computation.
Key research questions
- How are different functional cell types connected within and across circuits, and how are they coordinated at the network level?
- How do neural networks process information and represent features of the external world in the form of population codes?
- What are the mechanisms by which network dynamics (e.g. sequences of neural activity) reshape, or remain the same, across different behavioral paradigms?
Tools & Methods
We build models that explain features of experimental data and generate hypotheses that, in turn, guide new experiments.
1. Computational models. We build spiking and firing-rate neural network models that undergo learning, e.g. through plasticity rules or supervised methods.
2. Analysis of neural data. We use approaches from mathematics, statistical physics, information theory, dynamical systems theory and machine learning.
3. Large-scale recordings. We perform large-scale recordings in behaving animals using high-site-count silicon probes.
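As a toy illustration of the first item, a firing-rate network model can be integrated in a few lines. This is a minimal sketch, not the group's actual models; the network size, time constant, tanh nonlinearity, and random Gaussian connectivity are arbitrary choices made for the example:

```python
import numpy as np

def simulate_rate_network(W, n_steps=200, dt=0.1, tau=1.0, seed=0):
    """Euler-integrate tau * dr/dt = -r + tanh(W r + ext) for a toy rate network.

    W      : (n, n) recurrent weight matrix
    returns: (n_steps, n) array of firing rates over time
    """
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    r = np.zeros(n)                      # rates start at zero
    ext = rng.normal(0.0, 0.1, size=n)   # fixed external drive to each unit
    rates = np.empty((n_steps, n))
    for t in range(n_steps):
        r = r + (dt / tau) * (-r + np.tanh(W @ r + ext))
        rates[t] = r
    return rates

# Random recurrent weights, scaled by 1/sqrt(n) so the dynamics stay bounded
rng = np.random.default_rng(1)
n = 50
W = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, n))
rates = simulate_rate_network(W)
print(rates.shape)  # (200, 50)
```

Because the tanh nonlinearity is bounded and the Euler step is a convex combination of the current rate and the nonlinearity's output, the rates remain in [-1, 1] throughout the simulation.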
Kavli Communications Hub
The Dimensionality Reduction and Population Dynamics in Neural Data conference was held at Nordita in Stockholm on 11-14 February 2020. Most of the talks were recorded (see links below).
About the conference
The brain represents and processes information through the activity of many neurons whose firing patterns are correlated with each other in non-trivial ways. These correlations generally imply that the activity of a population of neurons involved in a task admits a lower-dimensional representation. Discovering and understanding such representations are therefore important steps towards understanding the operations of the nervous system, and theoretical and experimental neuroscientists have been making interesting progress on this subject. The aim of this conference is to bring together key players in the effort to develop methods for dimensionality reduction in neural data and to study the population dynamics of networks of neurons from this angle. We aim to review the current approaches to the problem, identify the major questions that need to be addressed in the future, and discuss how to move forward with those questions.
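The idea that correlated population activity admits a lower-dimensional representation can be illustrated with principal component analysis via the singular value decomposition. This is a generic sketch, not a method presented at the conference; the simulated data (100 neurons driven by 3 shared latent signals plus weak independent noise) are made up for the example:

```python
import numpy as np

def pca_variance_explained(X):
    """Fraction of variance captured by each principal component of X (time x neurons)."""
    Xc = X - X.mean(axis=0)                      # center each neuron's activity
    s = np.linalg.svd(Xc, compute_uv=False)      # singular values of centered data
    var = s ** 2                                 # proportional to PC variances
    return var / var.sum()

# Simulate population activity with planted 3-dimensional structure
rng = np.random.default_rng(0)
T, n_neurons, n_latents = 500, 100, 3
latents = rng.normal(size=(T, n_latents))        # shared latent signals
loading = rng.normal(size=(n_latents, n_neurons))
X = latents @ loading + 0.1 * rng.normal(size=(T, n_neurons))

frac = pca_variance_explained(X)
print(frac[:3].sum())  # the first 3 components capture most of the variance
```

Because the signal lives in a 3-dimensional subspace and the noise is weak, nearly all of the variance concentrates in the first three principal components, which is exactly the signature of a low-dimensional population code.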
See recordings from the conference here:
Conference in Stockholm playlist
Tuesday 11/02/2020
Sara Solla (Northwestern University) Neural manifolds for the stable control of movement
Matteo Marsili (ICTP) Multiscale relevance and informative encoding in neuronal spike trains
Srdjan Ostojic (ENS) Disentangling the roles of dimensionality and cell classes in neural computations (Lecture not recorded)
Wednesday 12/02/2020
Taro Toyoizumi (Riken) A local synaptic update rule for ICA and dimensionality reduction
Soledad Gonzalo Cogno (Kavli Institute, NTNU) Stereotyped population dynamics in the medial entorhinal cortex (Lecture not recorded)
Tatiana Engel (CSHL) Discovering interpretable models of neural population dynamics from data
Thursday 13/02/2020
Benjamin Dunn (Math Department, NTNU) TBA (Lecture not recorded)
Sophie Deneve (ENS) TBA (Lecture not recorded)
Barbara Feulner (Imperial College London) Learning within and outside of the neural manifold
Friday 14/02/2020
Mark Humphries (University of Nottingham) Strong and weak principles for neural dimension reduction
Devika Narain (Erasmus University Medical Center) Bayesian time perception through latent cortical dynamics