Theory of Neural Systems

This group investigates principles of self-organization in neural systems, primarily in the visual system but also in the hippocampus. A secondary line of research addresses signal processing, e.g. blind source separation.

    2018

  • The Interaction between Semantic Representation and Episodic Memory
    Fang, J., Rüther, N., Bellebaum, C., Wiskott, L., & Cheng, S.
    Neural Computation, 30(2), 293–332
  • Challenges in High-dimensional Controller Design with Evolution Strategies
    Müller, N., & Glasmachers, T.
    In Parallel Problem Solving from Nature (PPSN XV). Springer
  • Slowness as a Proxy for Temporal Predictability: An Empirical Comparison
    Weghenkel, B., & Wiskott, L.
    Neural Computation, 30(5), 1151–1179
    2017

  • Gaussian-binary restricted Boltzmann machines for modeling natural image statistics
    Melchior, J., Wang, N., & Wiskott, L.
    PLOS ONE, 12(2), 1–24
  • Experience-Dependency of Reliance on Local Visual and Idiothetic Cues for Spatial Representations Created in the Absence of Distal Information
    Draht, F., Zhang, S., Rayan, A., Schönfeld, F., Wiskott, L., & Manahan-Vaughan, D.
    Frontiers in Behavioral Neuroscience, 11(92)
  • Extensions of Hierarchical Slow Feature Analysis for Efficient Classification and Regression on High-Dimensional Data
    Escalante-B., A. N.
    Doctoral thesis, Ruhr University Bochum, Faculty of Electrical Engineering and Information Technology
  • Intrinsically Motivated Acquisition of Modular Slow Features for Humanoids in Continuous and Non-Stationary Environments
    Kompella, V. R., & Wiskott, L.
    arXiv preprint arXiv:1701.04663
  • Graph-based predictable feature analysis
    Weghenkel, B., Fischer, A., & Wiskott, L.
    Machine Learning, 1–22
    2016

  • Theoretical Analysis of the Optimal Free Responses of Graph-Based SFA for the Design of Training Graphs
    Escalante-B., A. N., & Wiskott, L.
    Journal of Machine Learning Research, 17(157), 1–36
  • Improved graph-based SFA: Information preservation complements the slowness principle
    Escalante-B., A. N., & Wiskott, L.
    arXiv preprint arXiv:1601.03945
  • How to Center Deep Boltzmann Machines
    Melchior, J., Fischer, A., & Wiskott, L.
    Journal of Machine Learning Research, 17(99), 1–61
  • A computational model of spatial encoding in the hippocampus
    Schönfeld, F.
    Doctoral thesis, Ruhr-Universität Bochum
    2015

  • Theoretical Analysis of the Optimal Free Responses of Graph-Based SFA for the Design of Training Graphs
    Escalante-B., A. N., & Wiskott, L.
    arXiv preprint arXiv:1509.08329 (accepted at the Journal of Machine Learning Research)
  • Continual curiosity-driven skill acquisition from high-dimensional video inputs for humanoid robots
    Kompella, V. R., Stollenga, M., Luciw, M., & Schmidhuber, J.
    Artificial Intelligence
  • Memory Storage Fidelity in the Hippocampal Circuit: The Role of Subregions and Input Statistics
    Neher, T., Cheng, S., & Wiskott, L.
    PLoS Computational Biology, 11(5), e1004250
  • Predictable Feature Analysis
    Richthofer, S., & Wiskott, L.
    In Workshop New Challenges in Neural Computation 2015 (NC2) (pp. 68–75)
  • Predictable Feature Analysis
    Richthofer, S., & Wiskott, L.
    In 2015 IEEE 14th International Conference on Machine Learning and Applications (ICMLA) (pp. 190–196)
  • Modeling place field activity with hierarchical slow feature analysis
    Schoenfeld, F., & Wiskott, L.
    Frontiers in Computational Neuroscience, 9(51)
    2014

  • Slow Feature Analysis on Retinal Waves Leads to V1 Complex Cells
    Dähne, S., Wilbert, N., & Wiskott, L.
    PLoS Computational Biology, 10(5), e1003564
  • Slow Feature Analysis for Curiosity-Driven Agents
    Tutorial at the 2014 IEEE World Congress on Computational Intelligence (WCCI)
  • Slowness Learning for Curiosity-Driven Agents
    Kompella, V. R.
    Doctoral thesis, Università della svizzera italiana (USI)
  • An Anti-hebbian Learning Rule to Represent Drive Motivations for Reinforcement Learning
    Kompella, V. R., Kazerounian, S., & Schmidhuber, J.
    In From Animals to Animats 13 (pp. 176–187). Springer International Publishing
  • Explore to See, Learn to Perceive, Get the Actions for Free: SKILLABILITY
    Kompella, V. R., Stollenga, M. F., Luciw, M. D., & Schmidhuber, J.
    In Proceedings of the IEEE International Joint Conference on Neural Networks (IJCNN)
  • An Extension of Slow Feature Analysis for Nonlinear Blind Source Separation
    Sprekeler, H., Zito, T., & Wiskott, L.
    Journal of Machine Learning Research, 15, 921–947
  • Modeling correlations in spontaneous activity of visual cortex with Gaussian-binary deep Boltzmann machines
    Wang, N., Jancke, D., & Wiskott, L.
    In Proc. Bernstein Conference for Computational Neuroscience, Sep 3–5, Göttingen, Germany (pp. 263–264). BFNT Göttingen
  • Modeling correlations in spontaneous activity of visual cortex with centered Gaussian-binary deep Boltzmann machines
    Wang, N., Jancke, D., & Wiskott, L.
    In Proc. International Conference on Learning Representations (ICLR′14, workshop), Apr 14–16, Banff, Alberta, Canada
  • Gaussian-binary Restricted Boltzmann Machines on Modeling Natural Image Statistics
    Wang, N., Melchior, J., & Wiskott, L.
    arXiv preprint arXiv:1401.5900
  • Learning predictive partitions for continuous feature spaces
    Weghenkel, B., & Wiskott, L.
    In Proc. 22nd European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN), Apr 23–25, Bruges, Belgium (pp. 577–582)
  • Elastic Bunch Graph Matching
    Wiskott, L., Würtz, R. P., & Westphal, G.
    Scholarpedia, 9, 10587
  • Spatial representations of place cells in darkness are supported by path integration and border information
    Zhang, S., Schoenfeld, F., Wiskott, L., & Manahan-Vaughan, D.
    Frontiers in Behavioral Neuroscience, 8(222)
    2013

  • A computational model for preplay in the hippocampus
    Azizi, A. H., Wiskott, L., & Cheng, S.
    Frontiers in Computational Neuroscience, 7(161), 1–15
  • How to Solve Classification and Regression Problems on High-Dimensional Data with a Supervised Extension of Slow Feature Analysis
    Escalante-B., A.-N., & Wiskott, L.
    Cognitive Sciences EPrint Archive (CogPrints)
  • How to Solve Classification and Regression Problems on High-Dimensional Data with a Supervised Extension of Slow Feature Analysis
    Escalante-B., A. N., & Wiskott, L.
    Journal of Machine Learning Research, 14, 3683–3719
  • Deep Hierarchies in the Primate Visual Cortex: What Can We Learn For Computer Vision?
    Krüger, N., Janssen, P., Kalkan, S., Lappe, M., Leonardis, A., Piater, J., et al.
    IEEE Trans. on Pattern Analysis and Machine Intelligence, 35(8), 1847–1871
  • An intrinsic value system for developing multiple invariant representations with incremental slowness learning
    Luciw*, M., Kompella*, V., Kazerounian, S., & Schmidhuber, J.
    Frontiers in Neurorobotics, 7 (*joint first authors)
  • How to Center Binary Restricted Boltzmann Machines
    Melchior, J., Fischer, A., Wang, N., & Wiskott, L.
    arXiv preprint arXiv:1311.1354
  • Are memories really stored in the hippocampal CA3 region?
    Neher, T., Cheng, S., & Wiskott, L.
    BoNeuroMed
  • Are memories really stored in the hippocampal CA3 region?
    Neher, T., Cheng, S., & Wiskott, L.
    In Proc. 10th Göttinger Meeting of the German Neuroscience Society, Mar 13–16, Göttingen, Germany (p. 104)
  • Predictable Feature Analysis
    Richthofer, S., & Wiskott, L.
    arXiv.org e-Print archive
  • RatLab: An easy to use tool for place code simulations
    Schoenfeld, F., & Wiskott, L.
    Frontiers in Computational Neuroscience, 7(104)
  • Modeling correlations in spontaneous activity of visual cortex with centered Gaussian-binary deep Boltzmann machines
    Wang, N., Jancke, D., & Wiskott, L.
    arXiv preprint arXiv:1312.6108
    2012

  • Slow Feature Analysis: Perspectives for Technical Applications of a Versatile Learning Algorithm
    Escalante-B., A. N., & Wiskott, L.
    Künstliche Intelligenz [Artificial Intelligence], 26(4), 341–348
  • Incremental slow feature analysis: Adaptive low-complexity slow feature updating from high-dimensional input streams
    Kompella, V. R., Luciw, M., & Schmidhuber, J.
    Neural Computation, 24(11), 2994–3024
  • Autonomous learning of abstractions using curiosity-driven modular incremental slow feature analysis
    Kompella, V. R., Luciw, M., Stollenga, M., Pape, L., & Schmidhuber, J.
    In 2012 IEEE International Conference on Development and Learning and Epigenetic Robotics (ICDL) (pp. 1–8). IEEE
  • Collective-reward based approach for detection of semi-transparent objects in single images
    Kompella, V. R., & Sturm, P.
    Computer Vision and Image Understanding, 116(4), 484–499
  • Hierarchical incremental slow feature analysis
    Luciw, M., Kompella, V. R., & Schmidhuber, J.
    Workshop on Deep Hierarchies in Vision
  • Sensory integration of place and head-direction cells in a virtual environment
    Schönfeld, F., & Wiskott, L.
    Poster at NeuroVisionen 8, Oct 26, 2012, Aachen, Germany
  • Sensory integration of place and head-direction cells in a virtual environment
    Schönfeld, F., & Wiskott, L.
    Poster at the 8th FENS Forum of Neuroscience, Jul 14–18, Barcelona, Spain
  • An Analysis of Gaussian-Binary Restricted Boltzmann Machines for Natural Images
    Wang, N., Melchior, J., & Wiskott, L.
    In Proc. 20th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, Apr 25–27, Bruges, Belgium (pp. 287–292)
    2011

  • Heuristic Evaluation of Expansions for Non-Linear Hierarchical Slow Feature Analysis
    Escalante, A., & Wiskott, L.
    In Proc. The 10th Intl. Conf. on Machine Learning and Applications (ICMLA′11), Dec 18–21, Honolulu, Hawaii (pp. 133–138) IEEE Computer Society
  • Incremental Slow Feature Analysis
    Kompella, V. R., Luciw, M. D., & Schmidhuber, J.
    In Proc. 22nd International Joint Conference on Artificial Intelligence (IJCAI 2011) (pp. 1354–1359)
  • AutoIncSFA and vision-based developmental learning for humanoid robots
    Kompella, V. R., Pape, L., Masci, J., Frank, M., & Schmidhuber, J.
    In 2011 11th IEEE-RAS International Conference on Humanoid Robots (Humanoids) (pp. 622–629). IEEE
  • Detection and avoidance of semi-transparent obstacles using a collective-reward based approach
    Kompella, V. R., & Sturm, P.
    In 2011 IEEE International Conference on Robotics and Automation (ICRA) (pp. 3469–3474). IEEE
    2010

  • 3-SAT on CUDA: Towards a massively parallel SAT solver
    Meyer, Q., Schönfeld, F., Stamminger, M., & Wanka, R.
    In 2010 International Conference on High Performance Computing & Simulation (HPCS) (pp. 306–313)
  • Building a Side Channel Based Disassembler
    Eisenbarth, T., Paar, C., & Weghenkel, B.
    In M. L. Gavrilova, Tan, C. J. K., & Moreno, E. D. (Eds.), Transactions on Computational Science X: Special Issue on Security in Computing, Part I (pp. 78–99) Berlin, Heidelberg: Springer Berlin Heidelberg
  • Gender and Age Estimation from Synthetic Face Images with Hierarchical Slow Feature Analysis.
    Escalante, A., & Wiskott, L.
    In International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU′10), Jun 28 – Jul 2, Dortmund

A brief introduction to Slow Feature Analysis

One of the main research topics of the TNS group is Slow Feature Analysis (SFA), an unsupervised learning method that extracts the slowest, i.e. smoothest, underlying features from a time series. These features can be used for dimensionality reduction, regression, and classification. In this post we first motivate the method with a code example, then go into more detail about the mathematics behind it, and finally provide links to other good resources on the material.
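To make the idea concrete, here is a minimal sketch of linear SFA in plain NumPy; this is an illustration for this post, not the group's reference implementation, and the toy signals and mixing matrix below are invented for the demo. The recipe: whiten the input, then take the directions along which the temporal derivative has the least variance.

```python
import numpy as np

def linear_sfa(x, n_features=1):
    """Linear Slow Feature Analysis on a (T, d) time series.

    Returns the n_features slowest output signals (zero mean,
    unit variance, mutually decorrelated), slowest first.
    """
    x = x - x.mean(axis=0)
    # Step 1: whiten (sphere) the input so it has unit covariance.
    val, vec = np.linalg.eigh(np.cov(x, rowvar=False))
    z = x @ (vec / np.sqrt(val))
    # Step 2: among whitened directions, pick those whose temporal
    # derivative (approximated by finite differences) varies least.
    dval, dvec = np.linalg.eigh(np.cov(np.diff(z, axis=0), rowvar=False))
    return z @ dvec[:, :n_features]  # eigh sorts ascending: slowest first

# Toy demo: a linear mixture of a slow and a fast sine.
t = np.linspace(0, 4 * np.pi, 2000)
slow, fast = np.sin(t), np.sin(11 * t)
x = np.column_stack([slow, fast]) @ np.array([[1.0, 0.7], [0.4, -0.8]]).T
y = linear_sfa(x, 1)[:, 0]  # highly correlated with `slow` (up to sign)
```

Note that SFA's outputs are determined only up to sign, so one compares the absolute correlation of the extracted feature with the known source.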

An extension to Slow Feature Analysis (xSFA)

Following our previous tutorial on Slow Feature Analysis (SFA), we now turn to xSFA, an unsupervised extension of the original SFA algorithm that uses the slow features extracted by SFA to reconstruct the individual sources of a nonlinear mixture. This task is known as blind source separation, e.g. recovering individual voices from the recording of a conversation between several people. In this tutorial we provide a short example demonstrating the capabilities of xSFA, discuss its limits, and offer some pointers on how and when to apply it. We also take a closer look at the theoretical background of xSFA to build an intuition for the mathematics behind it.
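Full xSFA involves more machinery (an iterative scheme that removes each extracted source before estimating the next), but its core step can be sketched: nonlinearly expand the observed mixture, run plain SFA in the expanded space, and the slowest feature already approximates the slowest source. The quadratic expansion, toy mixture, and coefficients below are invented for illustration; this is not the full xSFA algorithm.

```python
import numpy as np

def sfa(x, n=1):
    """Linear SFA: whiten, then minimize temporal-derivative variance."""
    x = x - x.mean(axis=0)
    val, vec = np.linalg.eigh(np.cov(x, rowvar=False))
    keep = val > 1e-10 * val.max()            # drop near-degenerate directions
    z = x @ (vec[:, keep] / np.sqrt(val[keep]))
    dval, dvec = np.linalg.eigh(np.cov(np.diff(z, axis=0), rowvar=False))
    return z @ dvec[:, :n]                    # eigh sorts ascending: slowest first

def quadratic_expand(x):
    """All monomials of degree <= 2 of the input channels."""
    d = x.shape[1]
    quads = [x[:, i] * x[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack([x] + quads)

# Invented nonlinear mixture of a slow source s1 and a fast source s2.
t = np.linspace(0, 4 * np.pi, 4000)
s1, s2 = np.sin(t), np.sin(13 * t)
x = np.column_stack([s1 + 0.3 * s1 * s2, s2 + 0.3 * s1 ** 2])

# SFA on the quadratically expanded mixture approximates s1 (up to sign).
y = sfa(quadratic_expand(x), 1)[:, 0]
```

Because the expansion only goes up to degree two and nothing is removed between extractions, this sketch recovers only the slowest source; the full algorithm iterates to pull out the remaining ones.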

The Institut für Neuroinformatik (INI) is a central research unit of the Ruhr-Universität Bochum. We aim to understand the fundamental principles by which organisms generate behavior and cognition, linked to their environments through sensory systems and acting on those environments through effector systems. Inspired by our insights into such natural cognitive systems, we seek new solutions to problems of information processing in artificial cognitive systems. We draw on a variety of disciplines, including experimental approaches from psychology and neurophysiology as well as theoretical approaches from physics, mathematics, electrical engineering, and applied computer science.

Universitätsstr. 150, Building NB, Room 3/32
D-44801 Bochum, Germany

Tel: (+49) 234 32-28967
Fax: (+49) 234 32-14210