Computational Neuroscience

Our group investigates the neural mechanisms underlying learning and memory using computational methods. The main language of communication is English. We warmly welcome bright students who would like to pursue a Bachelor's or Master's thesis in our group. Our unit is located at the Institute of Neural Computation and is a member of the Mercator Research Group (MRG) "Structure of Memory".

RESEARCH

We investigate the neural mechanisms underlying learning and memory using computational approaches. In particular, we study how a brain region, called the hippocampus, is involved in storing and retrieving episodic memories and in generating representations of space. There is overwhelming experimental evidence that the hippocampus is involved in both these functions, but it remains unclear why these two functions go together and how these functions are implemented in the hippocampus. To address these two questions, we employ a number of computational and theoretical approaches, including

  • biologically grounded, yet highly simplified, neural network models that capture the essence of the neural circuit mechanisms underlying learning and memory.
  • algorithmic models of the storage and retrieval of episodic memories.
  • theoretical models of the nature of episodic memory.
  • robotics simulations of spatial memory in rodents.

Below we describe a selection of projects that are currently ongoing in our group.


THE SHAPE OF THE HIPPOCAMPAL FORMATION AND ITS CONNECTIONS

Martin Pyka in our group has recently developed a method for modeling the anatomical layout of neurons and their projections. In the accompanying video, he uses this software to illustrate the peculiar gross anatomy of the hippocampal formation.


THE INTERACTION BETWEEN SEMANTIC AND EPISODIC MEMORY

We are developing a computational model of the encoding, storage, and retrieval of episodic memories that takes into account the interrelation between episodic memory and semantic representations. In the model, episodes are stored in terms of higher-order semantic representations rather than the underlying sensory inputs. We investigate, for example, what role semantic representations might play in episodic memory and how episodic memory can be used to infer semantic information.
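
To give a flavor of this storage scheme, the toy sketch below (Python/NumPy) stores an episode as a sequence of indices into a set of learned "semantic" prototypes and reconstructs only the gist at retrieval. It is an illustration of the general idea only, not our model, and all names and parameter values are made up for this example.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "semantic layer": one prototype vector per concept.
prototypes = rng.normal(size=(10, 64))         # 10 concepts, 64-dimensional features

def encode(sensory_input):
    # Replace a raw input vector by the index of its closest semantic prototype.
    return int(np.argmin(np.linalg.norm(prototypes - sensory_input, axis=1)))

def retrieve(stored_episode):
    # Reconstruct an approximate experience from the stored semantic codes.
    return prototypes[list(stored_episode)]

# Encoding: a short sequence of noisy experiences is reduced to the sequence of
# concepts involved, not the raw sensory vectors.
true_sequence = [3, 7, 1, 4]
sensory_stream = [prototypes[c] + 0.2 * rng.normal(size=64) for c in true_sequence]
stored_episode = [encode(x) for x in sensory_stream]   # typically recovers [3, 7, 1, 4]

# Retrieval recovers only the semantic gist, not the exact sensory details.
reconstruction = retrieve(stored_episode)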


ROBOTICS SIMULATION OF PLACE-SELECTIVE RESPONSES DRIVEN BY VISUAL INPUTS

The aim of the project is to understand how the rodent brain generates place-selective responses based on visual inputs alone. To model as closely as possible the conditions that a rodent faces, we let the small ePuck robot explore a real environment to collect images. These images are then processed with an algorithm called slow-feature analysis (SFA). In the future, we will study how to combine this visually-driven (allothetic) information with idiothetic spatial information to generate more robust location estimates.
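
For illustration, the sketch below implements plain linear SFA in NumPy on a toy signal. Our actual pipeline operates on the camera images collected by the robot and uses hierarchical SFA networks; the toy signal and all names below are chosen for this example only.

import numpy as np

def linear_sfa(x, n_components=2):
    # x: array of shape (n_timesteps, n_inputs), e.g. preprocessed image features.
    # Returns the n_components output signals that vary most slowly over time.
    x = x - x.mean(axis=0)                        # 1. center the data

    cov = np.cov(x, rowvar=False)                 # 2. whiten to unit covariance
    eigval, eigvec = np.linalg.eigh(cov)
    keep = eigval > 1e-10                         #    drop degenerate directions
    z = x @ (eigvec[:, keep] / np.sqrt(eigval[keep]))

    dz = np.diff(z, axis=0)                       # 3. temporal differences
    d_eigval, d_eigvec = np.linalg.eigh(np.cov(dz, rowvar=False))

    return z @ d_eigvec[:, :n_components]         # 4. smallest eigenvalues = slowest features

# Toy usage: a slowly drifting signal hidden among faster ones is recovered
# (up to sign and scale) as the first slow feature.
t = np.linspace(0, 4 * np.pi, 2000)
slow, fast = np.sin(0.25 * t), np.sin(11.0 * t)
mixed = np.column_stack([slow + 0.3 * fast, fast + 0.3 * slow, 0.5 * fast + 0.1 * slow])
slow_features = linear_sfa(mixed, n_components=1)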


THE GENERATION AND PROPAGATION OF NEURAL SEQUENCES IN THE HIPPOCAMPUS

Temporal sequences of neural activation can be observed in the hippocampus during the theta state and during sharp-wave ripple events. Since these temporal sequences are related to the ordering of the cells' place fields, it has been suggested that they are important for spatial navigation, planning, or learning. We are trying to understand how neural networks generate these sequences and how they propagate to downstream regions.
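
As a minimal illustration of one candidate mechanism, the sketch below simulates a rate network with asymmetric, chain-like connectivity in which a brief cue triggers a wave of activity that travels across the population. It is a generic example rather than one of our published models, and all parameter values are made up.

import numpy as np

n, steps, dt, tau = 40, 600, 1.0, 10.0   # neurons, time steps, step size (ms), time constant (ms)

w = np.zeros((n, n))
for i in range(n - 1):
    w[i + 1, i] = 1.2                    # asymmetric coupling: each neuron drives its successor

rate = np.zeros(n)
rate[0] = 1.0                            # a brief cue activates the first neuron
history = []
for _ in range(steps):
    rate += dt / tau * (-rate + np.tanh(w @ rate))
    history.append(rate.copy())
history = np.array(history)              # shape (steps, n): activity sweeps diagonally across neurons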

MODELING THE DYNAMICS OF DISEASE STATES IN DEPRESSION

Major depressive disorder (MDD) is a disabling condition that adversely affects a person's general health, work or school life, sleeping and eating habits, and family life. Despite intense research efforts, the response rates of antidepressant treatments are relatively low, and the etiology and progression of MDD remain poorly understood. To advance our understanding of MDD, we use computational modelling as described in our article.

For further information consult the project page (see above).

Download:
The model to simulate the dynamics of disease states in depression can be downloaded as a ZIP file and can be used with MATLAB.
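
To convey the flavor of the dynamical-systems approach, the sketch below (Python/NumPy) simulates a generic noisy bistable variable that occasionally switches between a "healthy" and a "depressed" state. It is only an illustration of the general idea, not the published model (for that, please see the article and the MATLAB download), and all parameter values are made up.

import numpy as np

rng = np.random.default_rng(0)
dt, t_max, noise = 0.01, 500.0, 0.35     # arbitrary units; values chosen for illustration only
steps = int(t_max / dt)

def drift(x):
    # Gradient descent on the double-well potential U(x) = x**4/4 - x**2/2,
    # which has two stable states: x = +1 ("healthy") and x = -1 ("depressed").
    return x - x**3

x = np.empty(steps)
x[0] = 1.0                               # start in the "healthy" well
for i in range(1, steps):
    x[i] = x[i - 1] + drift(x[i - 1]) * dt + noise * np.sqrt(dt) * rng.standard_normal()

# Fraction of time spent in the "depressed" well; stronger noise or a shallower
# "healthy" well makes episodes more frequent and longer.
print(f"time in depressed state: {np.mean(x < 0):.2f}")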


PARAMETRIC ANATOMICAL MODELING

With Parametric Anatomical Modeling (PAM), we propose a technique and a Python implementation for creating artificial neural networks that match the connectivity patterns and connection lengths of large-scale neural networks.

The basic idea of PAM is to trace neural, synaptic and intermediate layers from anatomical data and relate those layers to each other. With a set of mapping techniques, complex relationships between those layers can be defined to determine how axonal and dendritic projections traverse through space and where synapses are formed.
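
The sketch below illustrates this idea in a strongly simplified setting: two flat layers embedded in 3D space, a topographic mapping between them, and connection lengths measured along the mapped path. The actual add-on operates on anatomical meshes in Blender and supports much richer mappings; everything in this example, including the mapping function, is hypothetical.

import numpy as np

rng = np.random.default_rng(1)
n_pre, n_post, k = 200, 200, 5

# Neuron positions on a presynaptic layer (z = 0) and a postsynaptic layer (z = 2).
pre = np.column_stack([rng.uniform(0, 10, n_pre), rng.uniform(0, 5, n_pre), np.zeros(n_pre)])
post = np.column_stack([rng.uniform(0, 10, n_post), rng.uniform(0, 5, n_post), np.full(n_post, 2.0)])

def mapping(p):
    # A simple topographic mapping: project the presynaptic position straight onto the target layer.
    return np.array([p[0], p[1], 2.0])

# For each presynaptic neuron, form synapses with the k postsynaptic neurons closest
# to its mapped target point, and measure the connection length along the path
# presynaptic soma -> target point -> postsynaptic soma.
connections, lengths = [], []
for i, p in enumerate(pre):
    target = mapping(p)
    d = np.linalg.norm(post - target, axis=1)
    for j in np.argsort(d)[:k]:
        connections.append((i, int(j)))
        lengths.append(np.linalg.norm(target - p) + d[j])

print(f"{len(connections)} synapses, mean connection length {np.mean(lengths):.2f}")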

For further information consult the project page (see above).

Download:
PAM is available as an add-on for Blender and can be downloaded from a repository on GitHub. An importer for the neural network simulator NEST is available in a separate repository.


FREE HIGH-QUALITY FIGURES

Our group aims to provide the neuroscience community with a collection of high-quality SVG figures for free use in publications, presentations, websites, etc., via GitHub.

All SVG files in the repository are licensed under the Creative Commons Attribution 4.0 International License.

Download:
A ZIP file containing all of the currently available figures, as well as additional information concerning their creation, can be downloaded here.

Modeling the Dynamics of Disease States in Depression

We study under what conditions the model can account for the occurrence and recurrence of depressive episodes, and how the effects of antidepressant treatments and cognitive behavioral therapy can be captured within the same dynamical systems model by changing a small subset of parameters.

Parametric Anatomical Modeling (PAM)

With Parametric Anatomical Modeling (PAM), we propose a technique and a Python implementation for creating artificial neural networks that match the connectivity patterns and connection lengths of large-scale neural networks.

Walther, T., & Würtz, R. P. (2017). Unsupervised Acquisition of Human Body Models using Principles of Organic Computing. ArXiv e-prints. Retrieved from http://arxiv.org/abs/1704.03724
Cheng, S. (2017). Gedächtnisverbesserung: Möglichkeiten und kritische Betrachtung. In F. Hüttemann & Liggieri, K. (Eds.), Die Grenze. Diskurse des Transhumanismus (invited contribution). Bielefeld: transcript Verlag.
Cheng, S. (2017). Consolidation of Episodic Memory: An Epiphenomenon of Semantic Learning. In N. Axmacher & Rasch, B. (Eds.), Cognitive Neuroscience of Memory Consolidation (pp. 57–72). Cham, Switzerland: Springer International Publishing. http://doi.org/10.1007/978-3-319-45066-7_4
Werning, M., & Cheng, S. (2017). Taxonomy and Unity of Memory. In S. Bernecker & Michaelian, K. (Eds.), The Routledge Handbook of Philosophy of Memory (forthcoming). London: Routledge.
Babichev, A., Cheng, S., & Dabaghian, Y. A. (2016). Topological Schemas of Cognitive Maps and Spatial Learning. Frontiers in Computational Neuroscience, 10, 18. http://doi.org/10.3389/fncom.2016.00018
Cheng, S., & Werning, M. (2016). What is episodic memory if it is a natural kind? Synthese, 193(5), 1345–1385. http://doi.org/10.1007/s11229-014-0628-6
Cheng, S., Werning, M., & Suddendorf, T. (2016). Dissociating memory traces and scenario construction in mental time travel. Neuroscience & Biobehavioral Reviews, 60, 82–89. http://doi.org/10.1016/j.neubiorev.2015.11.011
Bayati, M., Valizadeh, A., Abbassian, A., & Cheng, S. (2015). Self-organization of synchronous activity propagation in neuronal networks driven by local excitation. Frontiers in Computational Neuroscience, 9, 69. http://doi.org/10.3389/fncom.2015.00069
Demic, S., & Cheng, S. (2014). Modeling the Dynamics of Disease States in Depression. PLOS ONE, 9(10), 1–14. http://doi.org/10.1371/journal.pone.0110358
Azizi, A. H., Schieferstein, N., & Cheng, S. (2014). The transformation from grid cells to place cells is robust to noise in the grid pattern. Hippocampus, 24(8), 912–919. http://doi.org/10.1002/hipo.22306
Pyka, M., Klatt, S., & Cheng, S. (2014). Parametric Anatomical Modeling: a method for modeling the anatomical layout of neurons and their projections. Frontiers in Neuroanatomy, 8, 91. http://doi.org/10.3389/fnana.2014.00091
Werning, M., & Cheng, S. (2014). Is Episodic Memory a Natural Kind? A Defense of the Sequence Analysis. In P. Bello, Guarini, M., McShane, M., & Scassellati, B. (Eds.), Proceedings of the 36th Annual Conference of the Cognitive Science Society (Vol. 2, pp. 964–969). Austin, TX: Cognitive Science Society. Retrieved from https://mindmodeling.org/cogsci2014/papers/173/paper173.pdf (also available at http://www.ruhr-uni-bochum.de/mam/phil-lang/content/cogsci2014_episodic_memory.pdf)
Cheng, S., & Werning, M. (2013). Composition and replay of mnemonic sequences: The contributions of REM and slow-wave sleep to episodic memory. Behavioral and Brain Sciences, 36(06), 610–611. http://doi.org/10.1017/s0140525x13001234
Azizi, A. H., Wiskott, L., & Cheng, S. (2013). A computational model for preplay in the hippocampus. Frontiers in Computational Neuroscience, 7, 161. http://doi.org/10.3389/fncom.2013.00161
Cheng, S. (2013). The CRISP theory of hippocampal function in episodic memory. Frontiers in Neural Circuits, 7, 88. http://doi.org/10.3389/fncir.2013.00088
Helduser, S., Cheng, S., & Güntürkün, O. (2013). Identification of two forebrain structures that mediate execution of memorized sequences in the pigeon. Journal of Neurophysiology, 109(4), 958–968. http://doi.org/10.1152/jn.00763.2012
Bayati, M., & Valizadeh, A. (2012). Effect of synaptic plasticity on the structure and dynamics of disordered networks of coupled neurons. Phys. Rev. E, 86(1), 011925. http://doi.org/10.1103/PhysRevE.86.011925
Crotty, P., Lasker, E., & Cheng, S. (2012). Constraints on the synchronization of entorhinal cortex stellate cells. Phys. Rev. E, 86(1), 011908. http://doi.org/10.1103/PhysRevE.86.011908
Buhry, L., Azizi, A. H., & Cheng, S. (2011). Reactivation, Replay, and Preplay: How It Might All Fit Together. Neural Plasticity, 2011, 1–11. http://doi.org/10.1155/2011/203462
Cheng, S., & Frank, L. M. (2011). The structure of networks that produce the transformation from grid cells to place cells. Neuroscience, 197, 293–306. http://doi.org/10.1016/j.neuroscience.2011.09.002
Cheng, S., & Frank, L. M. (2008). New Experiences Enhance Coordinated Neural Activity in the Hippocampus. Neuron, 57(2), 303–313. http://doi.org/10.1016/j.neuron.2007.11.035
Cheng, S., & Sabes, P. N. (2007). Calibration of Visually Guided Reaching Is Driven by Error-Corrective Learning and Internal Dynamics. Journal of Neurophysiology, 97(4), 3057–3069. http://doi.org/10.1152/jn.00897.2006
Cheng, S., & Sabes, P. N. (2006). Modeling Sensorimotor Learning with Linear Dynamical Systems. Neural Computation, 18(4), 760–793. http://doi.org/10.1162/neco.2006.18.4.760
Cheng, S., Petriconi, S., Pratt, S., Skoby, M., Gale, C., Jeon, S., et al. (2004). Statistical and dynamic models of charge balance functions. Phys. Rev. C, 69(5), 054906. http://doi.org/10.1103/PhysRevC.69.054906
Pratt, S., & Cheng, S. (2003). Removing distortions from charge balance functions. Phys. Rev. C, 68(1), 014907. http://doi.org/10.1103/PhysRevC.68.014907
Cheng, S., & Pratt, S. (2003). Isospin fluctuations from a thermally equilibrated hadron gas. Phys. Rev. C, 67(4), 044904. http://doi.org/10.1103/PhysRevC.67.044904
Cheng, S. (2002). Statistical physics in a finite volume with absolute conservation laws.
Cheng, S. (2002). Modeling Relativistic Heavy Ion Collisions.
Cheng, S., Pratt, S., Csizmadia, P., Nara, Y., Molnár, D., Gyulassy, M., et al. (2002). Effect of finite-range interactions in classical transport theory. Phys. Rev. C, 65(2), 024901. http://doi.org/10.1103/PhysRevC.65.024901
Cheng, S., & Pratt, S. (2001). Quantum corrections for pion correlations involving resonance decays. Phys. Rev. C, 63(5), 054904. http://doi.org/10.1103/PhysRevC.63.054904
Computational Studies of the Role of Semantic Representation in Episodic Memory

The goal of this project is to extend these results to new object types and classes of objects. A good command of Python is required.

Generically trained SFA networks for cognitive mapping in simulated rodents

This project will assess potential advantages of generically trained SFA networks in cognitive mapping of virtual environments. An excellent command of Python (including numpy, matplotlib, scipy) and C/C++ is required. Experience with the Blender simulation environment would be advantageous.

Modeling Enhanced Replay of Neuronal Sequences in the Hippocampus

Necessary and Sufficient Behavioral Evidence for Episodic Memory Traces

The goal of this project is to identify what behavioral evidence would be necessary and sufficient to claim that a nonhuman species possesses episodic memory traces. The project requires background knowledge of memory and experience in developing philosophical analyses.

Processing of Spatial Information in the Hippocampal Circuit

In this project, we study whether the responses of neurons in the hippocampal subregions (DG, CA3, and CA1) resemble the responses of recorded place cells. Knowledge of the programming language Python is required.

Robust Generation of Spatio-temporal Activity Patterns in Neuronal Networks

The goal of this project is to better understand the sensitivity of the various neural network models to noise.

Self-organization of spatially tuned neural activity in volumetric virtual environments

This project is geared towards extending the work of Franzius et al. (2007) to cope with volumetric cognitive mapping based on Slow Feature Analysis methods. An excellent command of Python (including numpy, matplotlib, scipy) and C/C++ is required. Experience with the Blender simulation environment would be advantageous.