Theory of Neural Systems

What we do

We are an interdisciplinary research group focusing on principles of self-organization in neural systems, ranging from artificial neural networks to the hippocampus. By bringing together machine learning and computational neuroscience, we explore ways of extracting representations from data that are useful for goal-directed learning, such as reinforcement learning.

Unsupervised Learning with Slow Feature Analysis

The goal of unsupervised learning algorithms is to discover structure in data. This can massively reduce the complexity of data-driven problems that would otherwise be infeasible to solve. A learning objective is needed to guide this search for structure. Our group is primarily interested in learning algorithms built on the slowness objective: the most interesting aspects of a signal are those that change most slowly over time. A concrete realization of this idea is the slow feature analysis (SFA) algorithm, which extracts a fixed number of slowly changing features from a data set. Applied to a wildlife video, for example, SFA can identify the position and orientation of a fish while discarding unimportant information, such as the value of every individual pixel in each frame.
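The core of SFA can be sketched in a few lines. The following is a minimal illustration of linear SFA in NumPy (the function and toy data are our own illustrative sketch, not a reference implementation): center and whiten the signal, then take the whitened directions in which the temporal derivative has the least variance.

```python
import numpy as np

def linear_sfa(X, n_features=1):
    """Minimal linear SFA sketch: slowest features of a time series X (T x n)."""
    X = X - X.mean(axis=0)                       # center
    eigval, eigvec = np.linalg.eigh(np.cov(X, rowvar=False))
    Z = X @ (eigvec / np.sqrt(eigval))           # whiten: cov(Z) = I
    dZ = np.diff(Z, axis=0)                      # temporal derivative
    _, deigvec = np.linalg.eigh(np.cov(dZ, rowvar=False))
    return Z @ deigvec[:, :n_features]           # smallest eigenvalues = slowest

# Toy data: a slow sine linearly mixed with a fast one.
t = np.linspace(0, 2 * np.pi, 1000)
slow, fast = np.sin(t), np.sin(30 * t)
X = np.column_stack([slow + fast, slow - 0.5 * fast])
y = linear_sfa(X).ravel()
# Up to sign and scale, y recovers the slow source.
corr = abs(np.corrcoef(y, slow)[0, 1])
```

Hierarchical or nonlinear SFA, as in the fish-video scenario above, applies the same computation after a nonlinear expansion of the input.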

Memory-based Reinforcement Learning

In recent years, machines have become capable of solving tasks of increasing complexity without explicit instruction. For example, a single neural network outperformed several human experts in a selection of classic video games (Atari 2600) based on visual input alone. Interactive learning problems, in which an agent interacts with an environment, can be cast into the theoretical framework of reinforcement learning, which is concerned with ways of maximizing the reward accumulated in such a setting. Our main interest lies in finding optimal behavior in visually complex environments, where incomplete information is a common issue. One focus of our research is the design of algorithms that use memory to overcome this problem by aggregating partial information from a history of observations.
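The simplest memory mechanism of this kind is to stack a short history of observations into the agent's state. The sketch below (a hypothetical helper class, not code from our publications) shows why this helps: a single position reading cannot reveal an object's velocity, but two consecutive readings can.

```python
from collections import deque
import numpy as np

class ObservationStacker:
    """Aggregate the last k observations into one state vector."""

    def __init__(self, k, obs_dim):
        # Start with a zero-padded history of length k.
        self.buffer = deque([np.zeros(obs_dim)] * k, maxlen=k)

    def observe(self, obs):
        """Append a new observation and return the stacked state."""
        self.buffer.append(np.asarray(obs, dtype=float))
        return np.concatenate(self.buffer)

# A moving dot: its position alone is ambiguous, but two stacked
# positions implicitly encode its velocity.
stacker = ObservationStacker(k=2, obs_dim=1)
s1 = stacker.observe([0.0])   # state: [0.0, 0.0] (zero-padded history)
s2 = stacker.observe([0.1])   # state: [0.0, 0.1] -> velocity recoverable
```

Frame stacking is only the most basic option; recurrent networks and episodic memories generalize it to longer and content-addressed histories.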

System Level Modeling of the Hippocampus

The hippocampus is a central processing structure in the mammalian brain. In humans it is tightly linked to episodic memory; without it we would not be able to acquire new memories. In rodents the hippocampus has been found to house a neural encoding of the surrounding space in the form of place cells, grid cells, and head direction cells.

In our group we develop computational models of both of these aspects of the hippocampus, episodic memory and spatial encoding, in an effort to understand how these two fundamental functions interact and how the underlying neuronal processes can support both within the same structure.

    2018

  • The Interaction between Semantic Representation and Episodic Memory
    Fang, J., Rüther, N., Bellebaum, C., Wiskott, L., & Cheng, S.
    Neural Computation, 30(2), 293–332
  • Challenges in High-dimensional Controller Design with Evolution Strategies
    Müller, N., & Glasmachers, T.
    In Parallel Problem Solving from Nature (PPSN XV) Springer
  • Dual SVM Training on a Budget
    Qaadan, S., Schüler, M., & Glasmachers, T.
    arXiv.org
  • Slowness as a Proxy for Temporal Predictability: An Empirical Comparison
    Weghenkel, B., & Wiskott, L.
    Neural Computation, 30(5), 1151–1179
    2017

  • Gaussian-binary restricted Boltzmann machines for modeling natural image statistics
    Melchior, J., Wang, N., & Wiskott, L.
    PLOS ONE, 12(2), 1–24
  • Generating sequences in recurrent neural networks for storing and retrieving episodic memories
    Bayati, M., Melchior, J., Wiskott, L., & Cheng, S.
    In Proc. 26th Annual Computational Neuroscience Meeting (CNS*2017): Part 2
  • Experience-Dependency of Reliance on Local Visual and Idiothetic Cues for Spatial Representations Created in the Absence of Distal Information
    Draht, F., Zhang, S., Rayan, A., Schönfeld, F., Wiskott, L., & Manahan-Vaughan, D.
    Frontiers in Behavioral Neuroscience, 11(92)
  • Extensions of Hierarchical Slow Feature Analysis for Efficient Classification and Regression on High-Dimensional Data
    Escalante-B., A. N.
    Doctoral thesis, Ruhr University Bochum, Faculty of Electrical Engineering and Information Technology
  • Intrinsically Motivated Acquisition of Modular Slow Features for Humanoids in Continuous and Non-Stationary Environments
    Kompella, V. R., & Wiskott, L.
    arXiv preprint arXiv:1701.04663
  • Graph-based predictable feature analysis
    Weghenkel, B., Fischer, A., & Wiskott, L.
    Machine Learning, 1–22
    2016

  • Theoretical analysis of the optimal free responses of graph-based SFA for the design of training graphs.
    Escalante-B., A. N., & Wiskott, L.
    Journal of Machine Learning Research, 17(157), 1–36
  • Improved graph-based SFA: Information preservation complements the slowness principle
    Escalante-B., A. N., & Wiskott, L.
    arXiv preprint arXiv:1601.03945
  • How to Center Deep Boltzmann Machines
    Melchior, J., Fischer, A., & Wiskott, L.
    Journal of Machine Learning Research, 17(99), 1–61
  • A computational model of spatial encoding in the hippocampus
    Schönfeld, F.
    Doctoral thesis, Ruhr-Universität Bochum
    2015

  • Continual curiosity-driven skill acquisition from high-dimensional video inputs for humanoid robots
    Kompella, V. R., Stollenga, M., Luciw, M., & Schmidhuber, J.
    Artificial Intelligence
  • Memory Storage Fidelity in the Hippocampal Circuit: The Role of Subregions and Input Statistics
    Neher, T., Cheng, S., & Wiskott, L.
    PLoS Computational Biology, 11(5), e1004250
  • Predictable Feature Analysis.
    Richthofer, S., & Wiskott, L.
    In Workshop New Challenges in Neural Computation 2015 (NC2) (pp. 68–75)
  • Predictable Feature Analysis.
    Richthofer, S., & Wiskott, L.
    In 2015 IEEE 14th International Conference on Machine Learning and Applications (ICMLA) (pp. 190–196)
  • Modeling place field activity with hierarchical slow feature analysis
    Schönfeld, F., & Wiskott, L.
    Frontiers in Computational Neuroscience, 9(51)
    2014

  • Slow Feature Analysis on Retinal Waves Leads to V1 Complex Cells.
    Dähne, S., Wilbert, N., & Wiskott, L.
    PLoS Computational Biology, 10(5), e1003564
  • Slow Feature Analysis for Curiosity-Driven Agents
    Tutorial at the 2014 IEEE World Congress on Computational Intelligence (WCCI)
  • Slowness Learning for Curiosity-Driven Agents
    Kompella, V. R.
    Doctoral thesis, Università della svizzera italiana (USI)
  • An Anti-hebbian Learning Rule to Represent Drive Motivations for Reinforcement Learning
    Kompella, V. R., Kazerounian, S., & Schmidhuber, J.
    In From Animals to Animats 13 (pp. 176–187) Springer International Publishing
  • Explore to See, Learn to Perceive, Get the Actions for Free: SKILLABILITY
    Kompella, V. R., Stollenga, M. F., Luciw, M. D., & Schmidhuber, J.
    In Proceedings of IEEE Joint Conference of Neural Networks (IJCNN)
  • An Extension of Slow Feature Analysis for Nonlinear Blind Source Separation
    Sprekeler, H., Zito, T., & Wiskott, L.
    Journal of Machine Learning Research, 15, 921–947
  • Modeling correlations in spontaneous activity of visual cortex with Gaussian-binary deep Boltzmann machines
    Wang, N., Jancke, D., & Wiskott, L.
    In Proc. Bernstein Conference for Computational Neuroscience, Sep 3–5, Göttingen, Germany (pp. 263–264) BFNT Göttingen
  • Modeling correlations in spontaneous activity of visual cortex with centered Gaussian-binary deep Boltzmann machines
    Wang, N., Jancke, D., & Wiskott, L.
    In Proc. International Conference on Learning Representations (ICLR′14, workshop), Apr 14–16, Banff, Alberta, Canada
  • Gaussian-binary Restricted Boltzmann Machines on Modeling Natural Image statistics
    Wang, N., Melchior, J., & Wiskott, L.
    arXiv preprint arXiv:1401.5900
  • Learning predictive partitions for continuous feature spaces
    Weghenkel, B., & Wiskott, L.
    In Proc. 22nd European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN), Apr 23-25, Bruges, Belgium (pp. 577–582)
  • Elastic Bunch Graph Matching
    Wiskott, L., Würtz, R. P., & Westphal, G.
    Scholarpedia, 9, 10587
  • Spatial representations of place cells in darkness are supported by path integration and border information
    Zhang, S., Schoenfeld, F., Wiskott, L., & Manahan-Vaughan, D.
    Frontiers in Behavioral Neuroscience, 8(222)
    2013

  • A computational model for preplay in the hippocampus
    Azizi, A. H., Wiskott, L., & Cheng, S.
    Frontiers in Computational Neuroscience, 7(161), 1–15
  • How to Solve Classification and Regression Problems on High-Dimensional Data with a Supervised Extension of Slow Feature Analysis
    Escalante-B., A. N., & Wiskott, L.
    Journal of Machine Learning Research, 14, 3683–3719
  • Deep Hierarchies in the Primate Visual Cortex: What Can We Learn For Computer Vision?
    Krüger, N., Janssen, P., Kalkan, S., Lappe, M., Leonardis, A., Piater, J., et al.
    IEEE Trans. on Pattern Analysis and Machine Intelligence, 35(8), 1847–1871
  • An intrinsic value system for developing multiple invariant representations with incremental slowness learning
    Luciw*, M., Kompella*, V., Kazerounian, S., & Schmidhuber, J.
    Frontiers in Neurorobotics, 7 (* joint first authors)
  • How to Center Binary Restricted Boltzmann Machines
    Melchior, J., Fischer, A., Wang, N., & Wiskott, L.
    arXiv preprint arXiv:1311.1354
  • Are memories really stored in the hippocampal CA3 region?
    Neher, T., Cheng, S., & Wiskott, L.
    BoNeuroMed
  • Are memories really stored in the hippocampal CA3 region?
    Neher, T., Cheng, S., & Wiskott, L.
    In Proc. 10th Göttinger Meeting of the German Neuroscience Society, Mar 13-16, Göttingen, Germany (p. 104)
  • Predictable Feature Analysis
    Richthofer, S., & Wiskott, L.
    arXiv.org e-Print archive
  • RatLab: An easy to use tool for place code simulations
    Schoenfeld, F., & Wiskott, L.
    Frontiers in Computational Neuroscience, 7(104)
  • Modeling correlations in spontaneous activity of visual cortex with centered Gaussian-binary deep Boltzmann machines
    Wang, N., Jancke, D., & Wiskott, L.
    arXiv preprint arXiv:1312.6108
    2012

  • Slow Feature Analysis: Perspectives for Technical Applications of a Versatile Learning Algorithm
    Escalante-B., A. N., & Wiskott, L.
    Künstliche Intelligenz [Artificial Intelligence], 26(4), 341–348
  • Incremental slow feature analysis: Adaptive low-complexity slow feature updating from high-dimensional input streams
    Kompella, V. R., Luciw, M., & Schmidhuber, J.
    Neural Computation, 24(11), 2994–3024
  • Autonomous learning of abstractions using curiosity-driven modular incremental slow feature analysis
    Kompella, V. R., Luciw, M., Stollenga, M., Pape, L., & Schmidhuber, J.
    In 2012 IEEE International Conference on Development and Learning and Epigenetic Robotics (ICDL) (pp. 1–8) IEEE
  • Collective-reward based approach for detection of semi-transparent objects in single images
    Kompella, V. R., & Sturm, P.
    Computer Vision and Image Understanding, 116(4), 484–499
  • Hierarchical incremental slow feature analysis
    Luciw, M., Kompella, V. R., & Schmidhuber, J.
    Workshop on Deep Hierarchies in Vision
  • Sensory integration of place and head-direction cells in a virtual environment
    Schönfeld, F., & Wiskott, L.
    Poster at NeuroVisionen 8, 26. Oct 2012, Aachen, Germany
  • Sensory integration of place and head-direction cells in a virtual environment
    Schönfeld, F., & Wiskott, L.
    Poster at the 8th FENS Forum of Neuroscience, Jul 14–18, Barcelona, Spain
  • An Analysis of Gaussian-Binary Restricted Boltzmann Machines for Natural Images
    Wang, N., Melchior, J., & Wiskott, L.
    In Proc. 20th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, Apr 25–27, Bruges, Belgium (pp. 287–292)
    2011

  • Slow feature analysis
    Wiskott, L., Berkes, P., Franzius, M., Sprekeler, H., & Wilbert, N.
    Scholarpedia, 6(4), 5282
  • Heuristic Evaluation of Expansions for Non-Linear Hierarchical Slow Feature Analysis.
    Escalante, A., & Wiskott, L.
    In Proc. The 10th Intl. Conf. on Machine Learning and Applications (ICMLA′11), Dec 18–21, Honolulu, Hawaii (pp. 133–138) IEEE Computer Society
  • Incremental Slow Feature Analysis.
    Kompella, V. R., Luciw, M. D., & Schmidhuber, J.
    IJCAI, 11, 1354–1359
  • Autoincsfa and vision-based developmental learning for humanoid robots
    Kompella, V. R., Pape, L., Masci, J., Frank, M., & Schmidhuber, J.
    In 2011 11th IEEE-RAS International Conference on Humanoid Robots (Humanoids) (pp. 622–629) IEEE
  • Detection and avoidance of semi-transparent obstacles using a collective-reward based approach
    Kompella, V. R., & Sturm, P.
    In 2011 IEEE International Conference on Robotics and Automation (ICRA) (pp. 3469–3474) IEEE
    2010

  • 3-SAT on CUDA: Towards a massively parallel SAT solver
    Meyer, Q., Schönfeld, F., Stamminger, M., & Wanka, R.
    In 2010 International Conference on High Performance Computing Simulation (pp. 306–313)
  • Building a Side Channel Based Disassembler
    Eisenbarth, T., Paar, C., & Weghenkel, B.
    In M. L. Gavrilova, Tan, C. J. K., & Moreno, E. D. (Eds.), Transactions on Computational Science X: Special Issue on Security in Computing, Part I (pp. 78–99) Berlin, Heidelberg: Springer Berlin Heidelberg
  • Gender and Age Estimation from Synthetic Face Images with Hierarchical Slow Feature Analysis.
    Escalante, A., & Wiskott, L.
    In International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU′10), Jun 28 – Jul 2, Dortmund
    2006

  • Analytical derivation of complex cell properties from the slowness principle
    Sprekeler, H., & Wiskott, L.
    In Proc. 2nd Bernstein Symposium for Computational Neuroscience, Oct 1–3, Berlin, Germany (p. 67) Bernstein Center for Computational Neuroscience (BCCN) Berlin
  • Analytical derivation of complex cell properties from the slowness principle
    Sprekeler, H., & Wiskott, L.
    In Proc. Berlin Neuroscience Forum, Jun 8–10, Bad Liebenwalde, Germany (pp. 65–66) Berlin: Max-Delbrück-Centrum für Molekulare Medizin (MDC)
  • Analytical derivation of complex cell properties from the slowness principle
    Sprekeler, H., & Wiskott, L.
    In Proc. 15th Annual Computational Neuroscience Meeting (CNS′06), Jul 16–20, Edinburgh, Scotland

A brief introduction to Slow Feature Analysis

One of the main research topics of the TNS group is Slow Feature Analysis (SFA), an unsupervised learning method that extracts the slowest or smoothest underlying features from a time series. It can be used for dimensionality reduction, regression, and classification. In this post we first motivate the method with a code example, then go into more detail about the math behind it, and finally provide links to other good resources on the material.

An extension to Slow Feature Analysis (xSFA)

Following our previous tutorial on Slow Feature Analysis (SFA), we now turn to xSFA, an unsupervised learning algorithm that extends SFA by using the slow features it extracts to reconstruct the individual sources of a nonlinear mixture, a process known as blind source separation (for example, reconstructing individual voices from the recording of a conversation between several people). In this tutorial we provide a short example demonstrating the capabilities of xSFA, discuss its limits, and offer some pointers on how and when to apply it. We also take a closer look at the theoretical background of xSFA to build an intuition for the mathematics behind it.

Modeling the hippocampus, part I: Why the hippocampus?

In this multi-part series I'd like to give an introduction to how computational neuroscience can work hand in hand with experimental neuroscience to help us understand how the mammalian brain works. As a case study we'll look at modeling the hippocampus, a central and essential structure in our daily dealings with reality. In part I of the series we first take a look at the hippocampus, its role in the brain, and what makes this particular structure so uniquely fascinating.

Modeling the hippocampus, part II: Hippocampal function

In this multi-part series I'd like to give an introduction to how computational neuroscience can work hand in hand with experimental neuroscience. In part II of the series we look at some of the fundamental problems of understanding brain computations. To get an idea of hippocampal function, we also discuss its involvement in human memory and how we came to know about it.

Modeling the hippocampus, part III: Spatial processing in the hippocampus

In this multi-part series I'd like to give an introduction to how computational neuroscience can work hand in hand with experimental neuroscience to understand the mammalian hippocampus. In this third part of the series we take a look at the role of the hippocampus in spatial processing in rodents, to get a better idea of the computation the hippocampus provides our brains with.

The Institut für Neuroinformatik (INI) is a central research unit of the Ruhr-Universität Bochum. We aim to understand the fundamental principles through which organisms generate behavior and cognition while linked to their environments through sensory systems and while acting in those environments through effector systems. Inspired by our insights into such natural cognitive systems, we seek new solutions to problems of information processing in artificial cognitive systems. We draw from a variety of disciplines that include experimental approaches from psychology and neurophysiology as well as theoretical approaches from physics, mathematics, electrical engineering and applied computer science.

Universitätsstr. 150, Building NB, Room 3/32
D-44801 Bochum, Germany

Tel: (+49) 234 32-28967
Fax: (+49) 234 32-14210