Theory of Neural Systems

What we do

We are an interdisciplinary research group focusing on principles of self-organization in neural systems, ranging from artificial neural networks to the hippocampus. By bringing together machine learning and computational neuroscience, we explore ways of extracting representations from data that are useful for goal-directed learning, such as reinforcement learning.

Unsupervised Learning with Slow Feature Analysis

The goal of unsupervised learning algorithms is to discover structure in data. This can massively reduce the complexity of data-driven problems that would otherwise be infeasible to solve. A learning objective is needed to guide this search for structure. Our group is primarily interested in learning algorithms built around the slowness objective: the most interesting things are those that change most slowly over time. A concrete realization is the slow feature analysis (SFA) algorithm, which extracts a set number of slowly changing features from a data set. For example, from a wildlife video it can identify the location and orientation of a fish in the scene while discarding unimportant information, such as the value of every pixel at each time step.
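The core of linear SFA can be illustrated in a few lines (a toy sketch, not the group's implementation; the `linear_sfa` helper and the two-channel toy signal are made up for this example): whiten the data, then take the direction along which the temporal derivative has the smallest variance.

```python
import numpy as np

def linear_sfa(x, n_features):
    """Minimal linear SFA sketch: whiten the input, then pick the
    directions with the smallest temporal variation (slowest features)."""
    x = x - x.mean(axis=0)                      # center
    cov = x.T @ x / len(x)                      # covariance matrix
    d, u = np.linalg.eigh(cov)
    whiten = u @ np.diag(d ** -0.5) @ u.T       # whitening transform
    z = x @ whiten
    dz = np.diff(z, axis=0)                     # temporal derivative
    dcov = dz.T @ dz / len(dz)
    dd, du = np.linalg.eigh(dcov)               # ascending eigenvalues
    return whiten @ du[:, :n_features]          # slowest directions first

# toy signal: a slow sine mixed into two channels with fast noise
t = np.linspace(0, 2 * np.pi, 1000)
slow = np.sin(t)
fast = np.sin(37 * t)
mix = np.stack([slow + 0.5 * fast, slow - 0.5 * fast], axis=1)

w = linear_sfa(mix, 1)
y = (mix - mix.mean(axis=0)) @ w               # extracted slow feature
print(abs(np.corrcoef(y[:, 0], slow)[0, 1]))   # should be close to 1
```

In practice the nonlinear power of SFA usually comes from expanding the input (e.g. polynomially) before applying this linear step.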

Memory-based Reinforcement Learning

In recent years, machines have become capable of solving tasks of increasing complexity without explicit instruction. For example, a single neural network architecture outperformed human experts in a selection of classic video games (Atari 2600) based on visual input alone. Interactive learning problems, in which an agent interacts with an environment, can be cast into the theoretical framework of reinforcement learning, which is concerned with maximizing the reward accumulated in such a setting. Our main interest lies in finding optimal behavior in visually complex environments, where incomplete information is a common issue. One focus of our research is the design of algorithms that use memory to overcome this problem by aggregating partial information from a history of observations.
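Why a memory of past observations helps under partial observability can be seen in a minimal sketch (a made-up toy corridor, not one of the benchmark environments): single observations are ambiguous, but a short observation history identifies the agent's state.

```python
# Toy partially observable setting: an agent walks right along a corridor
# and only observes the color of the current tile. Single observations are
# ambiguous (colors repeat), but aggregating the last few observations into
# a memory state disambiguates the position.
corridor = [0, 1, 0, 0, 1, 1, 0, 1]   # tile colors; only 2 distinct values

def history_state(obs_history, k=3):
    """Aggregate the last k observations into one hashable memory state."""
    return tuple(obs_history[-k:])

states = []
history = []
for pos in range(len(corridor)):
    history.append(corridor[pos])
    states.append(history_state(history))

# single observations cannot separate the 8 positions ...
print(len(set(corridor)))   # → 2
# ... but 3-step observation histories can
print(len(set(states)))     # → 8
```

A recurrent network plays the same role in the visually complex case, learning which parts of the history are worth aggregating.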

System Level Modeling of the Hippocampus

The hippocampus is a central processing structure in the mammalian brain. In humans it is tightly linked to episodic memory; without it we would not be able to acquire new memories. In rodents the hippocampus has been found to house a neural encoding of the surrounding space in the form of place cells, grid cells, and head direction cells.

In our group we develop computational models of both of these aspects of the hippocampus, in an effort to understand how the two fundamental functions, memory and spatial representation, interact, and how the underlying neuronal processes can perform both within the same structure.
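One of the spatial phenomena mentioned above can be illustrated in a deliberately simplified sketch (not one of our published models; track length, grid spacings, and threshold are invented for the example): summing grid-cell inputs with different spatial periods and thresholding the result yields a localized, place-cell-like response.

```python
import numpy as np

# Illustrative sketch: grid cells fire periodically along a 1-D track with
# different spacings; where their peaks align, the summed and thresholded
# input produces a single localized bump, resembling a place field.
x = np.linspace(0, 100, 1001)            # 1-D track position, in cm
periods = [30, 37, 43, 53]               # hypothetical grid spacings
# each grid cell's firing is periodic, with peaks aligned at x = 50
grid = np.array([np.cos(2 * np.pi * (x - 50) / p) for p in periods])
summed = grid.sum(axis=0)
place = np.maximum(summed - 2.0, 0.0)    # threshold keeps only the peak

print(x[np.argmax(place)])               # the aligned location, x ≈ 50
```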

Our Mission Statement

Career: Help young scientists realize their potential in science and society.
Research: Discover principles of self-organization of intelligent systems.
Education: Teach creativity through mathematical intuition.
Society: Contribute to an informed discussion about artificial intelligence.
Technology: Support technological progress by sharing our expertise.
2019

Learning gradient-based ICA by neurally estimating mutual information

Several methods for estimating the mutual information of random variables have been developed in recent years. They can prove valuable for novel approaches to learning statistically independent features. In this paper, we use one of these methods, a mutual information neural estimation (MINE) network, to present a proof of concept of how a neural network can perform linear ICA. We minimize the mutual information, as estimated by a MINE network, between the output units of a differentiable encoder network. This is done by simple alternating optimization of the two networks. The method is shown to reach solutions qualitatively equal to those of FastICA on blind source separation of noisy sources.
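The Donsker-Varadhan lower bound that a MINE network maximizes can be checked numerically in a toy setting (a simplification of the paper's setup: the statistics network is replaced by a one-parameter bilinear function T(x, y) = a·x·y, and the maximization is a grid search rather than gradient-based training).

```python
import numpy as np

# Donsker-Varadhan bound: I(X; Y) >= E_joint[T] - log E_marginals[exp(T)].
# MINE maximizes this over a neural T; here T(x, y) = a * x * y with a
# single scalar parameter, which suffices for a jointly Gaussian toy pair.
rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(size=n)
y = x + rng.normal(size=n)            # correlated pair: I(X;Y) = 0.5*log(2)
y_shuf = rng.permutation(y)           # samples from the product of marginals

def dv_bound(a):
    return np.mean(a * x * y) - np.log(np.mean(np.exp(a * x * y_shuf)))

# maximize the bound over the single parameter by a coarse grid search
estimate = max(dv_bound(a) for a in np.linspace(0.0, 1.0, 201))

exact = 0.5 * np.log(2.0)             # analytic MI of this Gaussian pair
# the restricted T family gives a valid but loose lower bound on `exact`
print(estimate, exact)
```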

2018

Measuring the Data Efficiency of Deep Learning Methods

We propose a new experimental protocol and use it to benchmark the data efficiency — performance as a function of training set size — of two deep learning algorithms, convolutional neural networks (CNNs) and hierarchical information-preserving graph-based slow feature analysis (HiGSFA), for tasks in classification and transfer learning scenarios.
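The notion of a data-efficiency curve can be sketched as follows (a hypothetical toy task with a nearest-centroid classifier, not the protocol, data, or models benchmarked in the paper): train the same learner on growing training sets and record test performance at each size.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    """Toy task: two Gaussian blobs in 2-D with balanced labels 0/1."""
    labels = np.arange(n) % 2
    points = rng.normal(size=(n, 2)) + 2.0 * labels[:, None]
    return points, labels

def nearest_centroid(train_x, train_y, test_x):
    """Classify test points by the nearer class centroid."""
    c0 = train_x[train_y == 0].mean(axis=0)
    c1 = train_x[train_y == 1].mean(axis=0)
    d0 = ((test_x - c0) ** 2).sum(axis=1)
    d1 = ((test_x - c1) ** 2).sum(axis=1)
    return (d1 < d0).astype(int)

test_x, test_y = make_data(2000)
results = {}
for n in [10, 100, 1000]:             # growing training set sizes
    train_x, train_y = make_data(n)
    results[n] = (nearest_centroid(train_x, train_y, test_x) == test_y).mean()
    print(n, round(results[n], 3))    # accuracy as a function of size
```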

Ongoing project by Stefan Richthofer: Predictable Feature Analysis

To apply Slow Feature Analysis (SFA) to interactive scenarios, it needs to deal with a control signal. Predictability is a crucial property of features involved in control. This project develops Predictable Feature Analysis (PFA): an SFA-inspired approach that extracts predictable features and leverages them to solve continuous control tasks.
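A heavily simplified stand-in for the underlying idea (not the published PFA algorithm, and with no control signal): with a linear one-step predictor and whitened data, the most predictable one-dimensional projection is the one with the largest lag-one autocorrelation.

```python
import numpy as np

# Toy data: a predictable AR(1) source mixed with white noise.
rng = np.random.default_rng(1)
n = 5000
s1 = np.zeros(n)
for t in range(1, n):
    s1[t] = 0.95 * s1[t - 1] + rng.normal()   # predictable source
s2 = rng.normal(size=n)                        # unpredictable source
mixing = np.array([[1.0, 0.8], [0.3, 1.0]])
x = np.stack([s1, s2], axis=1) @ mixing.T

# whiten, then find the projection with maximal lag-one autocorrelation:
# the top eigenvector of the symmetrized lag-one covariance matrix.
x = x - x.mean(axis=0)
cov = x.T @ x / n
d, u = np.linalg.eigh(cov)
z = x @ u @ np.diag(d ** -0.5) @ u.T          # whitened data
c1 = z[:-1].T @ z[1:] / (n - 1)               # lag-one covariance
vals, vecs = np.linalg.eigh((c1 + c1.T) / 2)  # symmetrize, eigen-decompose
w = vecs[:, -1]                               # most predictable direction
y = z @ w

# the extracted feature should track the predictable AR(1) source
print(abs(np.corrcoef(y, s1)[0, 1]))
```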

in press

  • Learning Cognitive Map Representations for Navigation by Sensorimotor Integration
    Zhao, D., Zhang, Z., Lu, H., Cheng, S., Si, B., & Feng, X.
    IEEE Transactions on Cybernetics
2020

  • Improving sensory representations using episodic memory
    Görler, R., Wiskott, L., & Cheng, S.
    Hippocampus, 30(6), 638–656
  • Trial-by-trial dynamics of reward prediction error associated signals during extinction learning and renewal
    Packheiser, J., Donoso, J. R., Cheng, S., Güntürkün, O., & Pusch, R.
    Progress in Neurobiology, 101901
  • The Hessian Estimation Evolution Strategy
    Glasmachers, T., & Krause, O.
    In Parallel Problem Solving from Nature (PPSN XVI) Springer
  • Convergence Analysis of the Hessian Estimation Evolution Strategy
    Glasmachers, T., & Krause, O.
    arxiv.org
  • Analyzing Reinforcement Learning Benchmarks with Random Weight Guessing
    Oller, D., Cuccu, G., & Glasmachers, T.
    In International Conference on Autonomous Agents and Multi-Agent Systems
  • AI for Social Good: Unlocking the Opportunity for Positive Impact
    Tomašev, N., Cornebise, J., Hutter, F., Picciariello, A., Connelly, B., Belgrave, D. C. M., et al.
    Nature Communications, 11, 2468
2019

  • Improved graph-based SFA: information preservation complements the slowness principle
    Escalante-B., A. N., & Wiskott, L.
    Machine Learning
  • Hippocampal Reactivation Extends for Several Hours Following Novel Experience
    Giri, B., Miyawaki, H., Mizuseki, K., Cheng, S., & Diba, K.
    The Journal of Neuroscience, 39(5), 866–875
  • Moment Vector Encoding of Protein Sequences for Supervised Classification
    Altartouri, H., & Glasmachers, T.
    In Practical Applications of Computational Biology and Bioinformatics, 13th International Conference (pp. 25–35) Springer International Publishing
  • Emerging category representation in the visual forebrain hierarchy of pigeons (Columba livia)
    Azizi, A. H., Pusch, R., Koenen, C., Klatt, S., Bröcker, F., Thiele, S., et al.
    Behavioural Brain Research, 356, 423–434
  • A Parallel RatSlam C++ Library Implementation
    de Souza Muñoz, M. E., Menezes, M. C., de Freitas, E. P., Cheng, S., de Almeida Neto, A., de Oliveira, A. C. M., & de Almeida Ribeiro, P. R.
    In Communications in Computer and Information Science (pp. 173–183) Springer International Publishing
  • Challenges of Convex Quadratic Bi-objective Benchmark Problems
    Glasmachers, T.
    In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO) (pp. 559–567) ACM
  • Global Convergence of the (1+1) Evolution Strategy
    Glasmachers, T.
    Evolutionary Computation Journal (ECJ)
  • Boosting Reinforcement Learning with Unsupervised Feature Extraction
    Hakenes, S., & Glasmachers, T.
    In Artificial Neural Networks and Machine Learning – ICANN 2019: Theoretical Neural Computation (pp. 555–566) Springer International Publishing
  • How do memory modules differentially contribute to familiarity and recollection?
    Hakobyan, O., & Cheng, S.
    Behavioral and Brain Sciences, 42, e288
  • Measuring the Data Efficiency of Deep Learning Methods
    Hlynsson, H., Escalante-B., A., & Wiskott, L.
    In Proceedings of the 8th International Conference on Pattern Recognition Applications and Methods SCITEPRESS - Science and Technology Publications
  • Learning Gradient-Based ICA by Neurally Estimating Mutual Information
    Hlynsson, H. D., & Wiskott, L.
    In C. Benzmüller & Stuckenschmidt, H. (Eds.), KI 2019: Advances in Artificial Intelligence (pp. 182–187) Cham: Springer International Publishing
  • Vehicle Shape and Color Classification Using Convolutional Neural Network
    Nafzi, M., Brauckmann, M., & Glasmachers, T.
    arxiv.org
  • Dual SVM Training on a Budget
    Qaadan, S., Schüler, M., & Glasmachers, T.
    In Proceedings of the 8th International Conference on Pattern Recognition Applications and Methods SCITEPRESS - Science and Technology Publications
  • Modeling Macroscopic Material Behavior With Machine Learning Algorithms Trained by Micromechanical Simulations
    Reimann, D., Nidadavolu, K., ul Hassan, H., Vajragupta, N., Glasmachers, T., Junker, P., & Hartmaier, A.
    Frontiers in Materials, 6, 181
  • SpikeDeeptector: A deep-learning based method for detection of neural spiking activity
    Saif-ur-Rehman, M., Lienkämper, R., Parpaley, Y., Wellmer, J., Liu, C., Lee, B., et al.
    Journal of Neural Engineering
  • Gradient-based Training of Slow Feature Analysis by Differentiable Approximate Whitening
    Schüler, M., Hlynsson, H. D., & Wiskott, L.
    In W. S. Lee & Suzuki, T. (Eds.), Proceedings of The Eleventh Asian Conference on Machine Learning (Vol. 101, pp. 316–331) Nagoya, Japan: PMLR
2018

  • A Neuro-Inspired Approach to Solve a Simultaneous Location and Mapping Task Using Shared Information in Multiple Robots Systems
    Menezes, M. C., de Freitas, E. P., Cheng, S., de Oliveira, A. C. M., & de Almeida Ribeiro, P. R.
    In 2018 15th International Conference on Control, Automation, Robotics and Vision (ICARCV) IEEE
  • Autonomous Exploration Guided by Optimisation Metaheuristic
    Santos, R. G., de Freitas, E. P., Cheng, S., de Almeida Ribeiro, P. R., & de Oliveira, A. C. M.
    In 2018 15th International Conference on Control, Automation, Robotics and Vision (ICARCV) IEEE
  • Storage fidelity for sequence memory in the hippocampal circuit
    Bayati, M., Neher, T., Melchior, J., Diba, K., Wiskott, L., & Cheng, S.
    PLOS ONE, 13(10), e0204685
  • Utilizing Slow Feature Analysis for Lipreading
    Freiwald, J., Karbasi, M., Zeiler, S., Melchior, J., Kompella, V., Wiskott, L., & Kolossa, D.
    In Speech Communication; 13th ITG-Symposium (pp. 191–195) VDE Verlag GmbH
  • The reduction of adult neurogenesis in depression impairs the retrieval of new as well as remote episodic memory
    Fang, J., Demic, S., & Cheng, S.
    PLOS ONE, 13(6), e0198406
  • The Interaction between Semantic Representation and Episodic Memory
    Fang, J., Rüther, N., Bellebaum, C., Wiskott, L., & Cheng, S.
    Neural Computation, 30(2), 293–332
  • Drift Theory in Continuous Search Spaces: Expected Hitting Time of the (1+1)-ES with 1/5 Success Rule
    Akimoto, Y., Auger, A., & Glasmachers, T.
    In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO) ACM
  • Speeding Up Budgeted Dual SVM Training with Precomputed GSS
    Glasmachers, T., & Qaadan, S.
    In R. Vera-Rodriguez, Velastin, S., & Morales, A. (Eds.), The 23rd Iberoamerican Congress on Pattern Recognition
  • Speeding Up Budgeted Stochastic Gradient Descent SVM Training with Precomputed Golden Section Search
    Glasmachers, T., & Qaadan, S.
    In G. Nicosia, Pardalos, P., Giuffrida, G., Umeton, R., & Sciacca, V. (Eds.), The 4th International Conference on machine Learning, Optimization and Data science - LOD 2018
  • Large Scale Black-box Optimization by Limited-Memory Matrix Adaptation
    Loshchilov, I., Glasmachers, T., & Beyer, H. -G.
    IEEE Transactions on Evolutionary Computation, 99
  • Challenges in High-dimensional Controller Design with Evolution Strategies
    Müller, N., & Glasmachers, T.
    In Parallel Problem Solving from Nature (PPSN XV) Springer
  • Multi-Merge Budget Maintenance for Stochastic Gradient Descent SVM Training
    Qaadan, S., & Glasmachers, T.
    13th WiML Workshop, Co-located with NeurIPS, Montreal, QC, Canada
  • Multi-Merge Budget Maintenance for Stochastic Gradient Descent SVM Training
    Qaadan, S., & Glasmachers, T.
    arXiv.org
  • Multi-Merge Budget Maintenance for Stochastic Coordinate Ascent SVM Training
    Qaadan, S., & Glasmachers, T.
    Artificial Intelligence International Conference – A2IC 2018
  • User-Centered Development of a Pedestrian Assistance System Using End-to-End Learning
    Qureshi, H. S., Glasmachers, T., & Wiczorek, R.
    In 17th IEEE International Conference on Machine Learning and Applications (ICMLA) (pp. 808–813) IEEE
  • Global Navigation Using Predictable and Slow Feature Analysis in Multiroom Environments, Path Planning and Other Control Tasks
    Richthofer, S., & Wiskott, L.
    CoRR e-print arXiv:1805.08565
  • Slowness as a Proxy for Temporal Predictability: An Empirical Comparison
    Weghenkel, B., & Wiskott, L.
    Neural Computation, 30(5), 1151–1179
  • Doing without metarepresentation: Scenario construction explains the epistemic generativity and privileged status of episodic memory
    Werning, M., & Cheng, S.
    Behavioral and Brain Sciences, 41, e34
2017

  • PFAx: Predictable Feature Analysis to Perform Control
    Richthofer, S., & Wiskott, L.
    ArXiv e-prints
  • Gaussian-binary restricted Boltzmann machines for modeling natural image statistics
    Melchior, J., Wang, N., & Wiskott, L.
    PLOS ONE, 12(2), 1–24
  • Generating sequences in recurrent neural networks for storing and retrieving episodic memories
    Bayati, M., Melchior, J., Wiskott, L., & Cheng, S.
    In Proc. 26th Annual Computational Neuroscience Meeting (CNS*2017): Part 2
  • Consolidation of Episodic Memory: An Epiphenomenon of Semantic Learning
    Cheng, S.
    In N. Axmacher & Rasch, B. (Eds.), Cognitive Neuroscience of Memory Consolidation (pp. 57–72) Cham, Switzerland: Springer International Publishing
  • Experience-Dependency of Reliance on Local Visual and Idiothetic Cues for Spatial Representations Created in the Absence of Distal Information
    Draht, F., Zhang, S., Rayan, A., Schönfeld, F., Wiskott, L., & Manahan-Vaughan, D.
    Frontiers in Behavioral Neuroscience, 11(92)
  • Extensions of Hierarchical Slow Feature Analysis for Efficient Classification and Regression on High-Dimensional Data
    Escalante-B., A. N.
    Doctoral thesis, Ruhr University Bochum, Faculty of Electrical Engineering and Information Technology
  • A Fast Incremental BSP Tree Archive for Non-dominated Points
    Glasmachers, T.
    In Evolutionary Multi-Criterion Optimization (EMO) Springer
  • Limits of End-to-End Learning
    Glasmachers, T.
    In Proceedings of the 9th Asian Conference on Machine Learning (ACML)
  • Texture attribute synthesis and transfer using feed-forward CNNs
    Irmer, T., Glasmachers, T., & Maji, S.
    In Winter Conference on Applications of Computer Vision (WACV) IEEE
  • Intrinsically Motivated Acquisition of Modular Slow Features for Humanoids in Continuous and Non-Stationary Environments
    Kompella, V. R., & Wiskott, L.
    arXiv preprint arXiv:1701.04663
  • Qualitative and Quantitative Assessment of Step Size Adaptation Rules
    Krause, O., Glasmachers, T., & Igel, C.
    In Conference on Foundations of Genetic Algorithms (FOGA) ACM
  • From grid cells to place cells with realistic field sizes
    Neher, T., Azizi, A. H., & Cheng, S.
    PLoS ONE, 12(7), e0181618
  • Graph-based predictable feature analysis
    Weghenkel, B., Fischer, A., & Wiskott, L.
    Machine Learning, 1–22
  • Taxonomy and Unity of Memory
    Werning, M., & Cheng, S.
    In S. Bernecker & Michaelian, K. (Eds.), The Routledge Handbook of Philosophy of Memory (pp. 7–20) London: Routledge
2016

  • Graph-based Predictable Feature Analysis
    Weghenkel, B., Fischer, A., & Wiskott, L.
    e-print arXiv:1602.00554v1
  • Topological Schemas of Cognitive Maps and Spatial Learning
    Babichev, A., Cheng, S., & Dabaghian, Y. A.
    Frontiers in Computational Neuroscience, 10, 18
  • What is episodic memory if it is a natural kind?
    Cheng, S., & Werning, M.
    Synthese, 193(5), 1345–1385
  • Dissociating memory traces and scenario construction in mental time travel
    Cheng, S., Werning, M., & Suddendorf, T.
    Neuroscience & Biobehavioral Reviews, 60, 82–89
  • Fast model selection by limiting SVM training times
    Demircioğlu, A., Horn, D., Glasmachers, T., Bischl, B., & Weihs, C.
    arxiv.org
  • A Unified View on Multi-class Support Vector Classification
    Doğan, Ü., Glasmachers, T., & Igel, C.
    Journal of Machine Learning Research, 17(45), 1–32
  • Theoretical analysis of the optimal free responses of graph-based SFA for the design of training graphs.
    Escalante-B., A. N., & Wiskott, L.
    Journal of Machine Learning Research, 17(157), 1–36
  • Improved graph-based SFA: Information preservation complements the slowness principle
    Escalante-B., A. N., & Wiskott, L.
    e-print arXiv:1601.03945
  • Finite Sum Acceleration vs. Adaptive Learning Rates for the Training of Kernel Machines on a Budget
    Glasmachers, T.
    In NIPS workshop on Optimization for Machine Learning
  • Small Stochastic Average Gradient Steps
    Glasmachers, T.
    In NIPS workshop on Optimizing the Optimizers
  • A Comparative Study on Large Scale Kernelized Support Vector Machines
    Horn, D., Demircioğlu, A., Bischl, B., Glasmachers, T., & Weihs, C.
    Advances in Data Analysis and Classification (ADAC), 1–17
  • Unbounded Population MO-CMA-ES for the Bi-Objective BBOB Test Suite
    Krause, O., Glasmachers, T., Hansen, N., & Igel, C.
    In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO)
  • Multi-objective Optimization with Unbounded Solution Sets
    Krause, O., Glasmachers, T., & Igel, C.
    In NIPS workshop on Bayesian Optimization
  • Anytime Bi-Objective Optimization with a Hybrid Multi-Objective CMA-ES (HMO-CMA-ES)
    Loshchilov, I., & Glasmachers, T.
    In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO)
  • How to Center Deep Boltzmann Machines
    Melchior, J., Fischer, A., & Wiskott, L.
    Journal of Machine Learning Research, 17(99), 1–61
  • A computational model of spatial encoding in the hippocampus
    Schönfeld, F.
    Doctoral thesis, Ruhr-Universität Bochum
  • Supervised Classification
    Weihs, C., & Glasmachers, T.
    In C. Weihs, Jannach, D., Vatolkin, I., & Rudolph, G. (Eds.), Music Data Analysis: Foundations and Applications
2015

  • Using SFA and PFA to solve navigation tasks in multi room environments
    Richthofer, S.
    Weekly Seminar talk, Institut für Neuroinformatik, Ruhr-Universität Bochum, Apr 1st, 2015, Bochum, Germany
  • Self-organization of synchronous activity propagation in neuronal networks driven by local excitation
    Bayati, M., Valizadeh, A., Abbassian, A., & Cheng, S.
    Frontiers in Computational Neuroscience, 9, 69
  • Theoretical Analysis of the Optimal Free Responses of Graph-Based SFA for the Design of Training Graphs
    Escalante-B., A. N., & Wiskott, L.
    e-print arXiv:1509.08329 (Accepted in Journal of Machine Learning Research)
  • Continual curiosity-driven skill acquisition from high-dimensional video inputs for humanoid robots
    Kompella, V. R., Stollenga, M., Luciw, M., & Schmidhuber, J.
    Artificial Intelligence
  • A CMA-ES with Multiplicative Covariance Matrix Updates
    Krause, O., & Glasmachers, T.
    In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO)
  • Memory Storage Fidelity in the Hippocampal Circuit: The Role of Subregions and Input Statistics
    Neher, T., Cheng, S., & Wiskott, L.
    PLoS Computational Biology, 11(5), e1004250
  • Predictable Feature Analysis
    Richthofer, S., & Wiskott, L.
    In 14th IEEE International Conference on Machine Learning and Applications, ICMLA 2015, Miami, FL, USA, December 9-11, 2015 (pp. 190–196)
  • Predictable Feature Analysis
    Richthofer, S., & Wiskott, L.
    In Workshop New Challenges in Neural Computation 2015 (NC2) (pp. 68–75)
  • Modeling place field activity with hierarchical slow feature analysis
    Schoenfeld, F., & Wiskott, L.
    Frontiers in Computational Neuroscience, 9(51)
2014

  • Modeling the Dynamics of Disease States in Depression
    Demic, S., & Cheng, S.
    PLOS ONE, 9(10), 1–14
  • The transformation from grid cells to place cells is robust to noise in the grid pattern
    Azizi, A. H., Schieferstein, N., & Cheng, S.
    Hippocampus, 24(8), 912–919
  • Slow Feature Analysis on Retinal Waves Leads to V1 Complex Cells.
    Dähne, S., Wilbert, N., & Wiskott, L.
    PLoS Comput Biol, 10(5), e1003564
  • Testing Hypotheses by Regularized Maximum Mean Discrepancy
    Danafar, S., Rancoita, P. M. V., Glasmachers, T., Whittingstall, K., & Schmidhuber, J.
    International Journal of Computer and Information Technology (IJCIT), 3(2)
  • Handling Sharp Ridges with Local Supremum Transformations
    Glasmachers, T.
    In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO)
  • Optimized Approximation Sets for Low-dimensional Benchmark Pareto Fronts
    Glasmachers, T.
    In Parallel Problem Solving from Nature (PPSN) Springer
  • Start Small, Grow Big - Saving Multiobjective Function Evaluations
    Glasmachers, T., Naujoks, B., & Rudolph, G.
    In Parallel Problem Solving from Nature (PPSN) Springer
  • Slow Feature Analysis for Curiosity-Driven Agents
    Tutorial at the 2014 IEEE World Congress on Computational Intelligence (WCCI)
  • Slowness Learning for Curiosity-Driven Agents
    Kompella, V. R.
    Doctoral thesis, Università della svizzera italiana (USI)
  • An Anti-hebbian Learning Rule to Represent Drive Motivations for Reinforcement Learning
    Kompella, V. R., Kazerounian, S., & Schmidhuber, J.
    In From Animals to Animats 13 (pp. 176–187) Springer International Publishing
  • Explore to See, Learn to Perceive, Get the Actions for Free: SKILLABILITY
    Kompella, V. R., Stollenga, M. F., Luciw, M. D., & Schmidhuber, J.
    In Proceedings of IEEE Joint Conference of Neural Networks (IJCNN)
  • Parametric Anatomical Modeling: a method for modeling the anatomical layout of neurons and their projections
    Pyka, M., Klatt, S., & Cheng, S.
    Frontiers in Neuroanatomy, 8, 91
  • An Extension of Slow Feature Analysis for Nonlinear Blind Source Separation
    Sprekeler, H., Zito, T., & Wiskott, L.
    Journal of Machine Learning Research, 15, 921–947
  • Modeling correlations in spontaneous activity of visual cortex with Gaussian-binary deep Boltzmann machines
    Wang, N., Jancke, D., & Wiskott, L.
    In Proc. Bernstein Conference for Computational Neuroscience, Sep 3–5, Göttingen, Germany (pp. 263–264) BFNT Göttingen
  • Modeling correlations in spontaneous activity of visual cortex with centered Gaussian-binary deep Boltzmann machines
    Wang, N., Jancke, D., & Wiskott, L.
    In Proc. International Conference of Learning Representations (ICLR'14, workshop), Apr 14–16, Banff, Alberta, Canada
  • Gaussian-binary Restricted Boltzmann Machines on Modeling Natural Image Statistics
    Wang, N., Melchior, J., & Wiskott, L.
    (Vol. 1401.5900) arXiv.org e-Print archive
  • Learning predictive partitions for continuous feature spaces
    Weghenkel, B., & Wiskott, L.
    In Proc. 22nd European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN), Apr 23-25, Bruges, Belgium (pp. 577–582)
  • Is Episodic Memory a Natural Kind?-A Defense of the Sequence Analysis
    Werning, M., & Cheng, S.
    In P. Bello, Guarini, M., McShane, M., & Scassellati, B. (Eds.), Proceedings of the 36th Annual Conference of the Cognitive Science Society (Vol. 2, pp. 964–69) Austin, TX: Cognitive Science Society
  • Natural Evolution Strategies
    Wierstra, D., Schaul, T., Glasmachers, T., Sun, Y., Peters, J., & Schmidhuber, J.
    Journal of Machine Learning Research, 15, 949–980
  • Elastic Bunch Graph Matching
    Wiskott, L., Würtz, R. P., & Westphal, G.
    Scholarpedia, 9, 10587
  • Spatial representations of place cells in darkness are supported by path integration and border information
    Zhang, S., Schoenfeld, F., Wiskott, L., & Manahan-Vaughan, D.
    Frontiers in Behavioral Neuroscience, 8(222)
2013

  • Composition and replay of mnemonic sequences: The contributions of REM and slow-wave sleep to episodic memory
    Cheng, S., & Werning, M.
    Behavioral and Brain Sciences, 36(06), 610–611
  • A computational model for preplay in the hippocampus
    Azizi, A. H., Wiskott, L., & Cheng, S.
    Frontiers in Computational Neuroscience, 7(161), 1–15
  • The CRISP theory of hippocampal function in episodic memory
    Cheng, S.
    Frontiers in Neural Circuits, 7, 88
  • How to Solve Classification and Regression Problems on High-Dimensional Data with a Supervised Extension of Slow Feature Analysis
    Escalante-B., A. -N., & Wiskott, L.
    Cognitive Sciences EPrint Archive (CogPrints)
  • How to Solve Classification and Regression Problems on High-Dimensional Data with a Supervised Extension of Slow Feature Analysis
    Escalante-B., A. N., & Wiskott, L.
    Journal of Machine Learning Research, 14, 3683–3719
  • A Natural Evolution Strategy with Asynchronous Strategy Updates
    Glasmachers, T.
    In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO)
  • The Planning-ahead SMO Algorithm
    Glasmachers, T.
    arxiv.org
  • Accelerated Coordinate Descent with Adaptive Coordinate Frequencies
    Glasmachers, T., & Doğan, Ü.
    In Proceedings of the fifth Asian Conference on Machine Learning (ACML)
  • Identification of two forebrain structures that mediate execution of memorized sequences in the pigeon
    Helduser, S., Cheng, S., & Güntürkün, O.
    Journal of Neurophysiology, 109(4), 958–968
  • Approximation properties of DBNs with binary hidden units and real-valued visible units
    Krause, O., Fischer, A., Glasmachers, T., & Igel, C.
    In Proceedings of the International Conference on Machine Learning (ICML)
  • Deep Hierarchies in the Primate Visual Cortex: What Can We Learn For Computer Vision?
    Krüger, N., Janssen, P., Kalkan, S., Lappe, M., Leonardis, A., Piater, J., et al.
    IEEE Trans. on Pattern Analysis and Machine Intelligence, 35(8), 1847–1871
  • An intrinsic value system for developing multiple invariant representations with incremental slowness learning
    Luciw*, M., Kompella*, V., Kazerounian, S., & Schmidhuber, J.
    Frontiers in Neurorobotics, 7 (*joint first authors)
  • How to Center Binary Restricted Boltzmann Machines
    Melchior, J., Fischer, A., Wang, N., & Wiskott, L.
    (Vol. 1311.1354) arXiv.org e-Print archive
  • Are memories really stored in the hippocampal CA3 region?
    Neher, T., Cheng, S., & Wiskott, L.
    BoNeuroMed
  • Are memories really stored in the hippocampal CA3 region?
    Neher, T., Cheng, S., & Wiskott, L.
    In Proc. 10th Göttinger Meeting of the German Neuroscience Society, Mar 13-16, Göttingen, Germany (p. 104)
  • Predictable Feature Analysis
    Richthofer, S., & Wiskott, L.
    CoRR, abs/1311.2503
  • RatLab: An easy to use tool for place code simulations
    Schoenfeld, F., & Wiskott, L.
    Frontiers in Computational Neuroscience, 7(104)
  • Modeling correlations in spontaneous activity of visual cortex with centered Gaussian-binary deep Boltzmann machines
    Wang, N., Jancke, D., & Wiskott, L.
    arXiv preprint arXiv:1312.6108
2012

  • Constraints on the synchronization of entorhinal cortex stellate cells
    Crotty, P., Lasker, E., & Cheng, S.
    Phys. Rev. E, 86(1), 011908
  • Turning Binary Large-margin Bounds into Multi-class Bounds
    Doğan, Ü., Glasmachers, T., & Igel, C.
    In ICML workshop on RKHS and kernel-based methods
  • A Note on Extending Generalization Bounds for Binary Large-margin Classifiers to Multiple Classes
    Doğan, Ü., Glasmachers, T., & Igel, C.
    In Proceedings of the European Conference on Machine Learning and Knowledge Discovery in Databases (ECML-PKDD)
  • Slow Feature Analysis: Perspectives for Technical Applications of a Versatile Learning Algorithm
    Escalante-B., A. N., & Wiskott, L.
    Künstliche Intelligenz [Artificial Intelligence], 26(4), 341–348
  • Convergence of the IGO-Flow of Isotropic Gaussian Distributions on Convex Quadratic Problems
    Glasmachers, T.
    In C. C. Coello, Cutello, V., Deb, K., Forrest, S., Nicosia, G., & Pavone, M. (Eds.), Parallel Problem Solving from Nature (PPSN) Springer
  • Kernel Representations for Evolving Continuous Functions
    Glasmachers, T., Koutník, J., & Schmidhuber, J.
    Journal of Evolutionary Intelligence, 5(3), 171–187
  • Incremental slow feature analysis: Adaptive low-complexity slow feature updating from high-dimensional input streams
    Kompella, V. R., Luciw, M., & Schmidhuber, J.
    Neural Computation, 24(11), 2994–3024
  • Autonomous learning of abstractions using curiosity-driven modular incremental slow feature analysis
    Kompella, V. R., Luciw, M., Stollenga, M., Pape, L., & Schmidhuber, J.
    In Development and Learning and Epigenetic Robotics (ICDL), 2012 IEEE International Conference on (pp. 1–8) IEEE
  • Collective-reward based approach for detection of semi-transparent objects in single images
    Kompella, V. R., & Sturm, P.
    Computer Vision and Image Understanding, 116(4), 484–499
  • Hierarchical incremental slow feature analysis
    Luciw, M., Kompella, V. R., & Schmidhuber, J.
    Workshop on Deep Hierarchies in Vision
  • Predictable Feature Analysis
    Richthofer, S., Weghenkel, B., & Wiskott, L.
    In Frontiers in Computational Neuroscience
  • Sensory integration of place and head-direction cells in a virtual environment
    Schönfeld, F., & Wiskott, L.
    Poster at NeuroVisionen 8, 26. Oct 2012, Aachen, Germany
  • Sensory integration of place and head-direction cells in a virtual environment
    Schönfeld, F., & Wiskott, L.
    Poster at the 8th FENS Forum of Neuroscience, Jul 14–18, Barcelona, Spain
  • An Analysis of Gaussian-Binary Restricted Boltzmann Machines for Natural Images
    Wang, N., Melchior, J., & Wiskott, L.
    In Proc. 20th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, Apr 25–27, Bruges, Belgium (pp. 287–292)
2011

  • Slow feature analysis
    Wiskott, L., Berkes, P., Franzius, M., Sprekeler, H., & Wilbert, N.
    Scholarpedia, 6(4), 5282
  • Reactivation, Replay, and Preplay: How It Might All Fit Together
    Buhry, L., Azizi, A. H., & Cheng, S.
    Neural Plasticity, 2011, 1–11
  • The structure of networks that produce the transformation from grid cells to place cells
    Cheng, S., & Frank, L. M.
    Neuroscience, 197, 293–306
  • Novelty Restarts for Evolution Strategies
    Cuccu, G., Gomez, F., & Glasmachers, T.
    In Proceedings of the IEEE Congress on Evolutionary Computation (CEC) IEEE
  • Heuristic Evaluation of Expansions for Non-Linear Hierarchical Slow Feature Analysis.
    Escalante, A., & Wiskott, L.
    In Proc. The 10th Intl. Conf. on Machine Learning and Applications (ICMLA′11), Dec 18–21, Honolulu, Hawaii (pp. 133–138) IEEE Computer Society
  • Optimal Direct Policy Search
    Glasmachers, T., & Schmidhuber, J.
    In Proceedings of the 4th Conference on Artificial General Intelligence (AGI)
  • Artificial Curiosity for Autonomous Space Exploration
    Graziano, V., Glasmachers, T., Schaul, T., Pape, L., Cuccu, G., Leitner, J., & Schmidhuber, J.
    Acta Futura
  • Incremental Slow Feature Analysis
    Kompella, V. R., Luciw, M. D., & Schmidhuber, J.
    IJCAI, 11, 1354–1359
  • Autoincsfa and vision-based developmental learning for humanoid robots
    Kompella, V. R., Pape, L., Masci, J., Frank, M., & Schmidhuber, J.
    In Humanoid Robots (Humanoids), 2011 11th IEEE-RAS International Conference on (pp. 622–629) IEEE
  • Detection and avoidance of semi-transparent obstacles using a collective-reward based approach
    Kompella, V. R., & Sturm, P.
    In Robotics and Automation (ICRA), 2011 IEEE International Conference on (pp. 3469–3474) IEEE
  • High Dimensions and Heavy Tails for Natural Evolution Strategies
    Schaul, T., Glasmachers, T., & Schmidhuber, J.
    In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO)
  • Coherence Progress: A Measure of Interestingness Based on Fixed Compressors
    Schaul, T., Pape, L., Glasmachers, T., Graziano, V., & Schmidhuber, J.
    In Proceedings of the 4th Conference on Artificial General Intelligence (AGI)
  • 2010

  • 3-SAT on CUDA: Towards a massively parallel SAT solver
    Meyer, Q., Schönfeld, F., Stamminger, M., & Wanka, R.
    In 2010 International Conference on High Performance Computing Simulation (pp. 306–313)
  • Building a Side Channel Based Disassembler
    Eisenbarth, T., Paar, C., & Weghenkel, B.
    In M. L. Gavrilova, Tan, C. J. K., & Moreno, E. D. (Eds.), Transactions on Computational Science X: Special Issue on Security in Computing, Part I (pp. 78–99) Berlin, Heidelberg: Springer Berlin Heidelberg
  • Gender and Age Estimation from Synthetic Face Images with Hierarchical Slow Feature Analysis
    Escalante, A., & Wiskott, L.
    In International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU′10), Jun 28 – Jul 2, Dortmund
  • Universal Consistency of Multi-Class Support Vector Classification
    Glasmachers, T.
    In Advances in Neural Information Processing Systems (NIPS)
  • Maximum Likelihood Model Selection for 1-Norm Soft Margin SVMs with Multiple Parameters
    Glasmachers, T., & Igel, C.
    IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(8), 1522–1528
  • A Natural Evolution Strategy for Multi-Objective Optimization
    Glasmachers, T., Schaul, T., & Schmidhuber, J.
    In Parallel Problem Solving from Nature (PPSN) Springer
  • Exponential Natural Evolution Strategies
    Glasmachers, T., Schaul, T., Sun, Y., Wierstra, D., & Schmidhuber, J.
    In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO)
  • Frontier Search
    Sun, Y., Glasmachers, T., Schaul, T., & Schmidhuber, J.
    In Proceedings of the 3rd Conference on Artificial General Intelligence (AGI)
  • 2008

  • New Experiences Enhance Coordinated Neural Activity in the Hippocampus
    Cheng, S., & Frank, L. M.
    Neuron, 57(2), 303–313
  • On related violating pairs for working set selection in SMO algorithms
    Glasmachers, T.
    In M. Verleysen (Ed.), Proceedings of the 16th European Symposium on Artificial Neural Networks (ESANN) d-side publications
  • Gradient Based Optimization of Support Vector Machines
    Glasmachers, T.
    Doctoral thesis, Fakultät für Mathematik, Ruhr-Universität Bochum, Germany
  • Second-Order SMO Improves SVM Online and Active Learning
    Glasmachers, T., & Igel, C.
    Neural Computation, 20(2), 374–382
  • Uncertainty Handling in Model Selection for Support Vector Machines
    Glasmachers, T., & Igel, C.
    In G. Rudolph, Jansen, T., Lucas, S., Poloni, C., & Beume, N. (Eds.), Parallel Problem Solving from Nature (PPSN) (pp. 185–194) Springer
  • Shark
    Igel, C., Heidrich-Meisner, V., & Glasmachers, T.
    Journal of Machine Learning Research, 9, 993–996
  • 2007

  • Calibration of Visually Guided Reaching Is Driven by Error-Corrective Learning and Internal Dynamics
    Cheng, S., & Sabes, P. N.
    Journal of Neurophysiology, 97(4), 3057–3069
  • Gradient-Based Optimization of Kernel-Target Alignment for Sequence Kernels Applied to Bacterial Gene Start Detection
    Igel, C., Glasmachers, T., Mersch, B., Pfeifer, N., & Meinicke, P.
    IEEE/ACM Transactions on Computational Biology and Bioinformatics (TCBB), 4(2), 216–226
  • Evolutionary Optimization of Sequence Kernels for Detection of Bacterial Gene Starts
    Mersch, B., Glasmachers, T., Meinicke, P., & Igel, C.
    International Journal of Neural Systems, 17(5), 369–381
  • 2006

  • Analytical derivation of complex cell properties from the slowness principle
    Sprekeler, H., & Wiskott, L.
    In Proc. 2nd Bernstein Symposium for Computational Neuroscience, Oct 1–3, Berlin, Germany (p. 67) Bernstein Center for Computational Neuroscience (BCCN) Berlin
  • Modeling Sensorimotor Learning with Linear Dynamical Systems
    Cheng, S., & Sabes, P. N.
    Neural Computation, 18(4), 760–793
  • Degeneracy in Model Selection for SVMs with Radial Gaussian Kernel
    Glasmachers, T.
    In M. Verleysen (Ed.), Proceedings of the 14th European Symposium on Artificial Neural Networks (ESANN) d-side publications
  • Maximum-Gain Working Set Selection for Support Vector Machines
    Glasmachers, T., & Igel, C.
    Journal of Machine Learning Research, 7, 1437–1466
  • Evolutionary Optimization of Sequence Kernels for Detection of Bacterial Gene Starts
    Mersch, B., Glasmachers, T., Meinicke, P., & Igel, C.
    In Proceedings of the 16th International Conference on Artificial Neural Networks (ICANN) Springer-Verlag
  • Analytical derivation of complex cell properties from the slowness principle
    Sprekeler, H., & Wiskott, L.
    In Proc. Berlin Neuroscience Forum, Jun 8–10, Bad Liebenwalde, Germany (pp. 65–66) Berlin: Max-Delbrück-Centrum für Molekulare Medizin (MDC)
  • Analytical derivation of complex cell properties from the slowness principle
    Sprekeler, H., & Wiskott, L.
    In Proc. 15th Annual Computational Neuroscience Meeting (CNS′06), Jul 16–20, Edinburgh, Scotland
  • 2005

  • Gradient-based Adaptation of General Gaussian Kernels
    Glasmachers, T., & Igel, C.
    Neural Computation, 17(10), 2099–2105
  • 2004

  • Statistical and dynamic models of charge balance functions
    Cheng, S., Petriconi, S., Pratt, S., Skoby, M., Gale, C., Jeon, S., et al.
    Phys. Rev. C, 69(5), 054906
  • 2003

  • Removing distortions from charge balance functions
    Pratt, S., & Cheng, S.
    Phys. Rev. C, 68(1), 014907
  • Isospin fluctuations from a thermally equilibrated hadron gas
    Cheng, S., & Pratt, S.
    Phys. Rev. C, 67(4), 044904
  • 2002

  • Statistical physics in a finite volume with absolute conservation laws
  • Modeling Relativistic Heavy Ion Collisions
  • Effect of finite-range interactions in classical transport theory
    Cheng, S., Pratt, S., Csizmadia, P., Nara, Y., Molnár, D., Gyulassy, M., et al.
    Phys. Rev. C, 65(2), 024901
  • 2001

  • Quantum corrections for pion correlations involving resonance decays
    Cheng, S., & Pratt, S.
    Phys. Rev. C, 63(5), 054904

A brief introduction to Slow Feature Analysis

One of the main research topics of the TNS group is Slow Feature Analysis (SFA), an unsupervised learning method that extracts the slowest, i.e. smoothest, underlying features from a time series. These features can be used for dimensionality reduction, regression, and classification. In this post we first motivate the method with a code example in which SFA is applied, then go into more detail about the math behind it, and finally provide links to other good resources on the material.
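To give a flavor of what such an example looks like, here is a minimal sketch of linear SFA in Python with NumPy. This is our own illustration, not the post's actual code: the data is centered and whitened, and the slowest features are the directions in whitened space whose temporal derivative has the smallest variance.

```python
import numpy as np

def sfa(x, n_features=1):
    """Minimal linear Slow Feature Analysis.

    x: array of shape (T, d), a d-dimensional time series.
    Returns the n_features slowest unit-variance output signals.
    """
    x = x - x.mean(axis=0)                       # center the data
    # Whiten: rotate onto principal components and scale to unit variance.
    eigval, eigvec = np.linalg.eigh(np.cov(x, rowvar=False))
    keep = eigval > 1e-9 * eigval.max()          # drop near-singular directions
    z = x @ (eigvec[:, keep] / np.sqrt(eigval[keep]))
    # Slowness: eigenvectors of the covariance of the temporal derivative,
    # sorted ascending by eigenvalue, give the slowest directions first.
    dcov = np.cov(np.diff(z, axis=0), rowvar=False)
    _, dvec = np.linalg.eigh(dcov)
    return z @ dvec[:, :n_features]

# Toy example: a slow sine hidden in a linear mixture with a fast one.
t = np.linspace(0, 2 * np.pi, 500)
slow, fast = np.sin(t), np.sin(30 * t)
mixture = np.stack([slow + 0.5 * fast, fast - 0.3 * slow], axis=1)
recovered = sfa(mixture, n_features=1)[:, 0]
# The slowest SFA output matches the slow source up to sign and scale.
print(abs(np.corrcoef(recovered, slow)[0, 1]))  # close to 1
```

The toy example mirrors the fish-video intuition from above: the raw channels mix a slow signal with fast fluctuations, and SFA recovers the slow one without any labels.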

An extension to Slow Feature Analysis (xSFA)

Following our previous tutorial on Slow Feature Analysis (SFA), we now turn to xSFA, an unsupervised learning algorithm that extends SFA by using the slow features it produces to reconstruct the individual sources of a nonlinear mixture. This task is known as blind source separation; a classic example is reconstructing individual voices from a recording of a conversation between several people. In this tutorial, we provide a short example to demonstrate the capabilities of xSFA, discuss its limits, and offer some pointers on how and when to apply it. We also take a closer look at the theoretical background of xSFA to build an intuition for the mathematics behind it.
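As a taste of the first step in this pipeline, here is a hedged sketch (again our own NumPy illustration, not the tutorial's code, with `sfa` and `quadratic_expansion` as hypothetical helper names): expand the observed mixture with quadratic monomials and run linear SFA on the expansion. The slowest output of this nonlinear SFA is the starting point from which xSFA then iteratively reconstructs the remaining sources.

```python
import numpy as np

def sfa(x, n_features=1):
    """Linear SFA: whiten, then minimize the variance of the derivative."""
    x = x - x.mean(axis=0)
    eigval, eigvec = np.linalg.eigh(np.cov(x, rowvar=False))
    keep = eigval > 1e-9 * eigval.max()
    z = x @ (eigvec[:, keep] / np.sqrt(eigval[keep]))
    _, dvec = np.linalg.eigh(np.cov(np.diff(z, axis=0), rowvar=False))
    return z @ dvec[:, :n_features]

def quadratic_expansion(x):
    """All monomials of degree 1 and 2 of the input channels."""
    d = x.shape[1]
    quads = [x[:, i] * x[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack([x] + quads)

# Two sources, nonlinearly mixed into two observed channels.
t = np.linspace(0, 4 * np.pi, 1000)
s1, s2 = np.sin(t), np.cos(11 * t)              # slow and fast source
obs = np.stack([s1 + 0.4 * s2**2, s2 + 0.3 * s1 * s2], axis=1)
y = sfa(quadratic_expansion(obs), n_features=1)[:, 0]

def slowness(sig):
    """Mean squared temporal derivative of the normalized signal."""
    sig = (sig - sig.mean()) / sig.std()
    return np.mean(np.diff(sig) ** 2)

# The extracted feature is slower than either raw observed channel.
print(slowness(y), slowness(obs[:, 0]), slowness(obs[:, 1]))
```

Because the expansion contains the raw channels among its monomials, the slowest SFA output is guaranteed to be at least as slow as any observed channel; turning that slow feature back into an estimate of the source itself is exactly what xSFA adds on top.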

Modeling the hippocampus, part I: Why the hippocampus?

In this multi-part series I'd like to give an introduction to how computational neuroscience can work hand in hand with experimental neuroscience to help us understand how the mammalian brain works. As a case study, we'll look at modeling the hippocampus, a central and essential structure in our daily dealings with reality. In part I of the series we first take a look at the hippocampus, its role in the brain, and what makes this particular structure so uniquely fascinating.

Modeling the hippocampus, part II: Hippocampal function.

In this multi-part series I'd like to give an introduction to how computational neuroscience can work hand in hand with experimental neuroscience. In part II of the series we take a look at some of the fundamental problems of understanding brain computations. To get an idea of hippocampal function, we also discuss its involvement in human memory and how we came to know about it.

Modeling the hippocampus, part III: Spatial processing in the hippocampus.

In this multi-part series I'd like to give an introduction to how computational neuroscience can work hand in hand with experimental neuroscience to understand the mammalian hippocampus. In this third part of the series we take a look at the role of the hippocampus in spatial processing in rodents to get a better idea of the computation the hippocampus provides our brains with.

Institute for Neural Computation

Department of Mathematics

Department of Electrical Engineering

The Institut für Neuroinformatik (INI) is a central research unit of the Ruhr-Universität Bochum. We aim to understand the fundamental principles through which organisms generate behavior and cognition while linked to their environments through sensory systems and while acting in those environments through effector systems. Inspired by our insights into such natural cognitive systems, we seek new solutions to problems of information processing in artificial cognitive systems. We draw from a variety of disciplines that include experimental approaches from psychology and neurophysiology as well as theoretical approaches from physics, mathematics, electrical engineering and applied computer science, in particular machine learning, artificial intelligence, and computer vision.

Universitätsstr. 150, Building NB, Room 3/32
D-44801 Bochum, Germany

Tel: (+49) 234 32-28967
Fax: (+49) 234 32-14210