
List of Publications of the Research Group of
Prof. Dr. Laurenz Wiskott


This list contains references to publications by Laurenz Wiskott and members of his group while they were working with him. Student projects are included only in exceptional cases.

You can limit the list by search terms. Use the Global QuickSearch or simply enter a search term into the respective field at the top of any column. If you do not want QuickSearch to also search in abstracts (where available), you can disable this via the 'Search Settings' button to the right. Note that you can use regular expressions in your search. For instance, to search for entries between 1990 and 1993, type '199[0-3]' in the Global QuickSearch; to find entries written by either Aimone or Zito, type 'Aimone|Zito' in the author search field.
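Since the search fields accept regular expressions, the two examples above behave like ordinary regex matching. A minimal Python sketch, using made-up sample entries (the data below are illustrative, not taken from the list itself):

```python
import re

# Hypothetical (author field, year) pairs standing in for list rows
entries = [
    ("Aimone, J.B. & Wiskott, L.", 2008),
    ("Althoff, O.; Erdmann, A.; Wiskott, L. & Hertel, P.", 1991),
    ("Zito, T. & Wiskott, L.", 2005),
]

# '199[0-3]' matches any year from 1990 to 1993
year_hits = [e for e in entries if re.search(r"199[0-3]", str(e[1]))]

# 'Aimone|Zito' matches entries authored by either name
author_hits = [e for e in entries if re.search(r"Aimone|Zito", e[0])]

print(len(year_hits), len(author_hits))  # 1 2
```

The same patterns typed into the search fields select the corresponding rows of the list.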

Key publications have an abstract. Publications without an abstract are largely redundant. You can easily select only the key publications by putting 'Abstract' (without the quotes) in the Title search field.

You can also sort all entries in ascending or descending order by clicking once or twice, respectively, on a column's title. For instance, to sort by the year published, simply click on 'Year' in the top field of the year column.
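The same one-click/two-click sorting can be mimicked offline, e.g. when working with an exported copy of the list. A small Python sketch with made-up entries (illustrative only):

```python
# Hypothetical (author, year) pairs standing in for list rows
entries = [("Berkes, P.", 2005), ("Althoff, O.", 1991), ("Aimone, J.B.", 2008)]

# Ascending by year (first click on 'Year')
asc = sorted(entries, key=lambda e: e[1])

# Descending by year (second click)
desc = sorted(entries, key=lambda e: e[1], reverse=True)

print([y for _, y in asc])   # [1991, 2005, 2008]
print([y for _, y in desc])  # [2008, 2005, 1991]
```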

If available, the Title fields also give quick access to the BibTeX entry, the Abstract, and a link to a .pdf version of the respective paper. [URL] usually points to an official page with the abstract or, less often, the full paper; [URL(2)] usually points to a full-paper preprint on our server; [URL(3)] usually points to additional material, such as a poster. (Note that some official full papers are subject to copyright restrictions, e.g. Neural Computation. You may copy them but not repost them elsewhere.)





    Author Year Title Reference BibTeX type Project
    Aimone, J.B. & Wiskott, L. 2008 Computational modeling of neurogenesis Chapter 22 in Adult Neurogenesis, Cold Spring Harbor Monograph Series, 52, 463-481.
     
    incollection Adult neurogenesis: Function II (2005-2007)
    Abstract: One of the most intriguing differences between ... [This book chapter has no abstract. Please follow the URL, select chapter 22, and read the introduction.]
    BibTeX:
    			
                            @incollection{AimoneWiskott-2008,
                              author       = {James B. Aimone and Laurenz Wiskott},
                              title        = {Computational modeling of neurogenesis},
                              booktitle    = {Adult Neurogenesis},
                              publisher    = {Cold Spring Harbor Laboratory Press},
                              year         = {2008},
                              volume       = {52},
                              pages        = {463--481},
                              url          = {http://books.google.com/books?id=5Kyahdob-NsC&printsec=frontcover&dq=Adult+neurogenesis&hl=en&src=bmrr&ei=M9n4Tej5B9DQsgbk58CKCQ&sa=X&oi=book_result&ct=result&resnum=1&ved=0CCkQ6AEwAA#v=onepage&q&f=false}
                            }
    			
    					
    Althoff, O.; Erdmann, A.; Wiskott, L. & Hertel, P. 1991 The photorefractive effect in LiNbO_3 at high light intensity phys. stat. sol. (a), 128, K41-K46.
     
    article Photorefractive effect in LiNbO3 (1989,1990)
    Abstract: In lithium niobate waveguides and also in the bulk material, the refractive index change caused by a very high light intensity is much stronger than would be expected from measurements at low intensities. In this note we present a quantitative investigation of these phenomena and discuss some possible explanations.
    BibTeX:
    			
                            @article{AlthoffErdmannEtAl-1991,
                              author       = {O. Althoff and A. Erdmann and L. Wiskott and P. Hertel},
                              title        = {The photorefractive effect in {LiNbO$_3$} at high light intensity},
                              journal      = {phys.\ stat.\ sol.\ (a)},
                              year         = {1991},
                              volume       = {128},
                              pages        = {K41--K46},
                              url          = {http://onlinelibrary.wiley.com/doi/10.1002/pssa.2211280138/abstract},
                              doi          = {http://doi.org/10.1002/pssa.2211280138}
                            }
    			
    					
    Appleby, P.A.; Kempermann, G. & Wiskott, L. 2011 The role of additive neurogenesis and synaptic plasticity in a hippocampal memory model with grid-cell like input PLoS Comput Biol, 7(1), e1001063.
     
    article Adult neurogenesis: Function III (2007-2009)
    Abstract: Contrary to the long-standing belief that no new neurons are added to the adult brain, it is now known that new neurons are born in a number of different brain regions and animals. One such region is the hippocampus, an area that plays an important role in learning and memory. In this paper we explore the effect of adding new neurons in a computational model of rat hippocampal function. Our hypothesis is that adding new neurons helps in forming new memories without disrupting memories that have already been stored. We find that adding new units is indeed superior to either changing connectivity or allowing neuronal turnover (where old units die and are replaced). We then show that a more biologically plausible mechanism that combines all three of these processes produces the best performance. Our work provides a strong theoretical argument as to why new neurons are born in the adult hippocampus: the new units allow the network to adapt in a way that is not possible by rearranging existing connectivity using conventional plasticity or neuronal turnover.
    BibTeX:
    			
                            @article{ApplebyKempermannEtAl-2011,
                              author       = {Appleby, Peter A. AND Kempermann, Gerd AND Wiskott, Laurenz},
                              title        = {The role of additive neurogenesis and synaptic plasticity in a hippocampal memory model with grid-cell like input},
                              journal      = {PLoS Comput Biol},
                              publisher    = {Public Library of Science},
                              year         = {2011},
                              volume       = {7},
                              number       = {1},
                              pages        = {e1001063},
                              url          = {http://dx.doi.org/10.1371/journal.pcbi.1001063},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/ApplebyKempermannEtAl-2011-PLoSCompBiol.pdf},
                              doi          = {http://doi.org/10.1371/journal.pcbi.1001063}
                            }
    			
    					
    Appleby, P.A.; Lezius, S.; Bandt, C.; Kempermann, G. & Wiskott, L. 2007 Neurogenesis avoids catastrophic interference in a sparsely coding dentate gyrus Proc. 3rd Bernstein Symposium for Computational Neuroscience, Sep 24-27, Göttingen, Germany, 41.
     
    inproceedings Adult neurogenesis: Function II (2005-2007), Adult neurogenesis: Dynamics II (2006-2013)
    BibTeX:
    			
                            @inproceedings{ApplebyLeziusEtAl-2007a,
                              author       = {Peter A. Appleby and Susanne Lezius and Christoph Bandt and Gerd Kempermann and Laurenz Wiskott},
                              title        = {Neurogenesis avoids catastrophic interference in a sparsely coding dentate gyrus},
                              booktitle    = {Proc.\ 3rd Bernstein Symposium for Computational Neuroscience, Sep 24--27, Göttingen, Germany},
                              publisher    = {Bernstein Center for Computational Neuroscience (BCCN) Göttingen},
                              year         = {2007},
                              pages        = {41}
                            }
    			
    					
    Appleby, P.A.; Lezius, S.; Kirste, I.; Bandt, C.; Kempermann, G. & Wiskott, L. 2007 Adult neurogenesis in the dentate gyrus: data analysis and modeling Proc. Midterm Evaluation of the German National Network for Computational Neuroscience, Dec 3-4, Berlin, Germany, 26.
     
    inproceedings Adult neurogenesis: Function II (2005-2007), Adult neurogenesis: Dynamics II (2006-2013)
    BibTeX:
    			
                            @inproceedings{ApplebyLeziusEtAl-2007b,
                              author       = {Peter A. Appleby and Susanne Lezius and Imke Kirste and Christoph Bandt and Gerd Kempermann and Laurenz Wiskott},
                              title        = {Adult neurogenesis in the dentate gyrus: data analysis and modeling},
                              booktitle    = {Proc.\ Midterm Evaluation of the German National Network for Computational Neuroscience, Dec 3--4, Berlin, Germany},
                              year         = {2007},
                              pages        = {26}
                            }
    			
    					
    Appleby, P.A. & Wiskott, L. 2009 Additive neurogenesis as a strategy for avoiding interference in a sparsely-coding dentate gyrus Network: Computation in Neural Systems, 20(3), 137-161.
     
    article Adult neurogenesis: Function II (2005-2007)
    Abstract: Recently we presented a model of additive neurogenesis in a linear, feedforward neural network that performed an encoding-decoding memory task in a changing input environment. Growing the neural network over time allowed the network to adapt to changes in input statistics without disrupting retrieval properties, and we proposed that adult neurogenesis might fulfil a similar computational role in the dentate gyrus of the hippocampus. Here we explicitly evaluate this hypothesis by examining additive neurogenesis in a simplified hippocampal memory model. The model incorporates a divergence in unit number from the entorhinal cortex to the dentate gyrus and sparse coding in the dentate gyrus, both notable features of hippocampal processing. We evaluate two distinct adaptation strategies; neuronal turnover, where the network is of fixed size but units may be deleted and new ones added, and additive neurogenesis, where the network grows over time, and quantify the performance of the network across the full range of adaptation levels from zero in a fixed network to one in a fully adapting network. We find that additive neurogenesis is always superior to neuronal turnover as it permits the network to be responsive to changes in input statistics while at the same time preserving representations of earlier environments.
    BibTeX:
    			
                            @article{ApplebyWiskott-2009,
                              author       = {Peter A. Appleby and Laurenz Wiskott},
                              title        = {Additive neurogenesis as a strategy for avoiding interference in a sparsely-coding dentate gyrus},
                              journal      = {Network: Computation in Neural Systems},
                              year         = {2009},
                              volume       = {20},
                              number       = {3},
                              pages        = {137--161},
                              url          = {http://informahealthcare.com/doi/abs/10.1080/09548980902993156},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/ApplebyWiskott-2009-Network-Neurogenesis-Preprint.pdf},
                              doi          = {http://doi.org/10.1080/09548980902993156}
                            }
    			
    					
    Appleby, P.; Kempermann, G. & Wiskott, L. 2010 The role of neurogenesis in the hippocampus Adult Neurogenesis: Structure and Function, June, Frauenchiemsee, Germany.
     
    inproceedings Adult neurogenesis: Function III (2007-2010)
    BibTeX:
    			
                            @inproceedings{ApplebyKempermannEtAl-2010,
                              author       = {Peter Appleby and Gerd Kempermann and Laurenz Wiskott},
                              title        = {The role of neurogenesis in the hippocampus},
                              booktitle    = {Adult Neurogenesis: Structure and Function, June, Frauenchiemsee, Germany},
                              year         = {2010}
                            }
    			
    					
    Appleby, P. & Wiskott, L. 2006 Adult neurogenesis in the central nervous system Berlin Neuroscience Forum 2006, August, Liebenwalde, Germany.
     
    inproceedings Adult neurogenesis: Function II (2005-2007)
    BibTeX:
    			
                            @inproceedings{ApplebyWiskott-2006,
                              author       = {Peter Appleby and Laurenz Wiskott},
                              title        = {Adult neurogenesis in the central nervous system},
                              booktitle    = {Berlin Neuroscience Forum 2006, August, Liebenwalde, Germany},
                              year         = {2006}
                            }
    			
    					
    Appleby, P. & Wiskott, L. 2007 Additive neurogenesis as a strategy for avoiding catastrophic interference in a sparsely coding dentate gyrus BCCN Symposium, March, Berlin, Germany.
     
    inproceedings Adult neurogenesis: Function II (2005-2007)
    BibTeX:
    			
                            @inproceedings{ApplebyWiskott-2007a,
                              author       = {Peter Appleby and Laurenz Wiskott},
                              title        = {Additive neurogenesis as a strategy for avoiding catastrophic interference in a sparsely coding dentate gyrus},
                              booktitle    = {BCCN Symposium, March, Berlin, Germany},
                              year         = {2007}
                            }
    			
    					
    Appleby, P. & Wiskott, L. 2007 The role of adult neurogenesis in the dentate gyrus Perspectives in Computational Neuroscience Symposium, September, Göttingen, Germany.
     
    inproceedings Adult neurogenesis: Function II (2005-2007)
    BibTeX:
    			
                            @inproceedings{ApplebyWiskott-2007b,
                              author       = {Peter Appleby and Laurenz Wiskott},
                              title        = {The role of adult neurogenesis in the dentate gyrus},
                              booktitle    = {Perspectives in Computational Neuroscience Symposium, September, Göttingen, Germany},
                              year         = {2007}
                            }
    			
    					
    Azizi, A.H.; Wiskott, L. & Cheng, S. 2013 A computational model for preplay in the hippocampus Frontiers in Computational Neuroscience, 7(161), 1-15.
     
    article Preplay in the Hippocampus (2010-2013)
    Abstract: The hippocampal network produces sequences of neural activity even when there is no time-varying external drive. In offline states, the temporal sequence in which place cells fire spikes correlates with the sequence of their place fields. Recent experiments found this correlation even between offline sequential activity (OSA) recorded before the animal ran in a novel environment and the place fields in that environment. This preplay phenomenon suggests that OSA is generated intrinsically in the hippocampal network, and not established by external sensory inputs. Previous studies showed that continuous attractor networks with asymmetric patterns of connectivity, or with slow, local negative feedback, can generate sequential activity. This mechanism could account for preplay if the network only represented a single spatial map, or chart. However, global remapping in the hippocampus implies that multiple charts are represented simultaneously in the hippocampal network and it remains unknown whether the network with multiple charts can account for preplay. Here we show that it can. Driven with random inputs, the model generates sequences in every chart. Place fields in a given chart and OSA generated by the network are highly correlated. We also find significant correlations, albeit less frequently, even when the OSA is correlated with a new chart in which place fields are randomly scattered. These correlations arise from random correlations between the orderings of place fields in the new chart and those in a pre-existing chart. Our results suggest two different accounts for preplay. Either an existing chart is re-used to represent a novel environment or a new chart is formed.
    BibTeX:
    			
                            @article{AziziWiskottEtAl-2013,
                              author       = {Azizi, Amir Hossein and Wiskott, Laurenz and Cheng, Sen},
                              title        = {A computational model for preplay in the hippocampus},
                              journal      = {Frontiers in Computational Neuroscience},
                              year         = {2013},
                              volume       = {7},
                              number       = {161},
                              pages        = {1--15},
                              url          = {http://www.frontiersin.org/computational_neuroscience/10.3389/fncom.2013.00161/abstract},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/AziziWiskottEtAl-2013-FrontiersCNS.pdf},
                              doi          = {http://doi.org/10.3389/fncom.2013.00161}
                            }
    			
    					
    Bandt, C.; Beißwanger, E.; Wiskott, L. & Kempermann, G. 2005 A dynamical model for neural cell development Proc. XXV Dynamics Days Europe 2005, Jul 25-28, Berlin, Germany, Europhysics Conference Series, 29 E, 233.
     
    inproceedings Adult neurogenesis: Dynamics I (2004,2005)
    BibTeX:
    			
                            @inproceedings{BandtBeisswangerEtAl-2005,
                              author       = {Christoph Bandt and Elena Beißwanger and Laurenz Wiskott and Gerd Kempermann},
                              title        = {A dynamical model for neural cell development},
                              booktitle    = {Proc.\ XXV Dynamics Days Europe 2005, Jul 25--28, Berlin, Germany},
                              year         = {2005},
                              volume       = {29 E},
                              pages        = {233}
                            }
    			
    					
    Baucks, F.; Leschke, J.; Metzger, C. & Wiskott, L. 2023 Ein Dashboard für die Studienberatung: Technische Infrastruktur und Studienverlaufsplanung im Projekt KI:edu.nrw Workshop Proceedings of the 21st Fachtagung Bildungstechnologien (DELFI).
     
    inproceedings
    BibTeX:
    			
                            @inproceedings{BaucksLeschkeEtAl-2023,
                              author       = {Baucks, Frederik and Leschke, Jonas and Metzger, Christian and Wiskott, Laurenz},
                              title        = {Ein Dashboard für die Studienberatung: Technische Infrastruktur und Studienverlaufsplanung im Projekt KI:edu.nrw},
                              booktitle    = {Workshop Proceedings of the 21st Fachtagung Bildungstechnologien (DELFI)},
                              publisher    = {Gesellschaft für Informatik e.V.},
                              year         = {2023}
                            }
    			
    					
    Baucks, F.; Schmucker, R. & Wiskott, L. 2023 Tracing Changes in University Course Difficulty Using Item-Response Theory Proc. AAAI Workshop on AI for Education.
     
    inproceedings
    Abstract: Curriculum analytics (CA) studies educational program structure and student data to ensure the quality of courses inside a curriculum. Ensuring low variation in course difficulty over time is crucial to warrant equal treatment of individual student cohorts and consistent degree outcomes. Still, existing CA techniques (e.g., process mining/simulation and curriculum-based prediction) are unable to capture such temporal variations due to their central assumption of timeinvariant course behavior. In this paper, we introduce item response theory (IRT) as a new methodology to the CA domain to address the open problem of tracing changes in course difficulty over time. We show the suitability of IRT to capture variance in course performance data and assess the validity and reliability of IRT-based difficulty estimates. Using data from 664 CS Bachelor students, we show how IRT can yield valuable insights by revealing variations in course difficulty over multiple years. Furthermore, we observe a systematic shift in course difficulty during the COVID-19 pandemic.
    BibTeX:
    			
                            @inproceedings{BaucksSchmuckerEtAl-2023,
                              author       = {Baucks, Frederik and Schmucker, Robin and Wiskott, Laurenz},
                              title        = {Tracing Changes in University Course Difficulty Using Item-Response Theory},
                              booktitle    = {Proc. AAAI Workshop on AI for Education},
                              year         = {2023},
                              url          = {https://ai4ed.cc/workshops/aaai2023}
                            }
    			
    					
    Baucks, F. & Wiskott, L. 2022 Simulating Policy Changes in Prerequisite-Free Curricula: A Supervised Data-Driven Approach Proc. 15th International Conference on Educational Data Mining, 470-476.
     
    inproceedings
    Abstract: Curriculum research is an important tool for understanding complex processes within a degree program. In particular, stochastic graphical models and simulations on related curriculum graphs have been used to make predictions about dropout rates, grades and degree completion time. There exists, however, little research on changes in the curriculum and the evaluation of their impact. The available evaluation methods of curriculum changes assume pre-existing strict curriculum graphs in the form of directed acyclic graphs. These allow for a straightforward model-oriented probabilistic or graph topological investigation of curricula. But the existence of such graphs cannot generally be assumed. We present a novel generalizing approach in which a curriculum graph is constructed based on data, using measurable student flow. By applying a discrete event simulation, we investigate the impact of policy changes on the curriculum and evaluate our approach on a sample data set from a German university. Our method is able to create a comparably effective and individually verifiable simulation without requiring a curriculum graph. It can thus be extended to prerequisite-free curricula, making it feasible to evaluate changes to flexible curricula.
    BibTeX:
    			
                            @inproceedings{BaucksWiskott-2022,
                              author       = {Baucks, Frederik and Wiskott, Laurenz},
                              title        = {Simulating Policy Changes in Prerequisite-Free Curricula: A Supervised Data-Driven Approach},
                              booktitle    = {Proc. 15th International Conference on Educational Data Mining},
                              publisher    = {International Educational Data Mining Society},
                              year         = {2022},
                              pages        = {470--476},
                              doi          = {http://doi.org/10.5281/zenodo.6853177}
                            }
    			
    					
    Baucks, F. & Wiskott, L. 2023 Mitigating Biases using an Additive Grade Point Model: Towards Trustworthy Curriculum Analytics Measures Proc. 21. Fachtagung Bildungstechnologien (DELFI), 41-52.
    (Best Paper Nominee)  
    inproceedings
    BibTeX:
    			
                            @inproceedings{BaucksWiskott-2023a,
                              author       = {Baucks, Frederik and Wiskott, Laurenz},
                              title        = {Mitigating Biases using an Additive Grade Point Model: Towards Trustworthy Curriculum Analytics Measures},
                              booktitle    = {Proc. 21. Fachtagung Bildungstechnologien (DELFI)},
                              publisher    = {Gesellschaft für Informatik e.V.},
                              year         = {2023},
                              pages        = {41--52},
                              doi          = {http://doi.org/10.18420/delfi2023-12}
                            }
    			
    					
    Baucks, F. & Wiskott, L. 2023 Von der Forschung in die Praxis: Entwicklung eines Dashboards für die Studienberatung Abstract & Talk at 2nd Learning AID.
     
    misc
    BibTeX:
    			
                            @misc{BaucksWiskott-2023b,
                              author       = {Baucks, Frederik and Wiskott, Laurenz},
                              title        = {Von der Forschung in die Praxis: Entwicklung eines Dashboards für die Studienberatung},
                              year         = {2023},
                              howpublished = {Abstract \& Talk at 2nd Learning AID}
                            }
    			
    					
    Bayati, M.; Melchior, J.; Wiskott, L. & Cheng, S. 2017 Generating sequences in recurrent neural networks for storing and retrieving episodic memories Proc. 26th Annual Computational Neuroscience Meeting (CNS*2017): Part 2.
    (Special issue of BMC Neuroscience 18(Suppl 1):P31)  
    inproceedings
    BibTeX:
    			
                            @inproceedings{BayatiMelchiorEtAl-2017,
                              author       = {Mehdi Bayati and Jan Melchior and Laurenz Wiskott and Sen Cheng},
                              title        = {Generating sequences in recurrent neural networks for storing and retrieving episodic memories},
                              booktitle    = {Proc. 26th Annual Computational Neuroscience Meeting (CNS*2017): Part 2},
                              year         = {2017},
                              url          = {http://europepmc.org/articles/PMC5592442}
                            }
    			
    					
    Bayati, M.; Neher, T.; Melchior, J.; Diba, K.; Wiskott, L. & Cheng, S. 2018 Storage fidelity for sequence memory in the hippocampal circuit PLoS One, 13(10), 1-33.
     
    article
    Abstract: Episodic memories have been suggested to be represented by neuronal sequences, which are stored and retrieved from the hippocampal circuit. A special difficulty is that realistic neuronal sequences are strongly correlated with each other since computational memory models generally perform poorly when correlated patterns are stored. Here, we study in a computational model under which conditions the hippocampal circuit can perform this function robustly. During memory encoding, CA3 sequences in our model are driven by intrinsic dynamics, entorhinal inputs, or a combination of both. These CA3 sequences are hetero-associated with the input sequences, so that the network can retrieve entire sequences based on a single cue pattern. We find that overall memory performance depends on two factors: the robustness of sequence retrieval from CA3 and the circuit’s ability to perform pattern completion through the feedforward connectivity, including CA3, CA1 and EC. The two factors, in turn, depend on the relative contribution of the external inputs and recurrent drive on CA3 activity. In conclusion, memory performance in our network model critically depends on the network architecture and dynamics in CA3.
    BibTeX:
    			
                            @article{BayatiNeherEtAl-2018,
                              author       = {Bayati, Mehdi AND Neher, Torsten AND Melchior, Jan AND Diba, Kamran AND Wiskott, Laurenz AND Cheng, Sen},
                              title        = {Storage fidelity for sequence memory in the hippocampal circuit},
                              journal      = {PLoS One},
                              publisher    = {Public Library of Science},
                              year         = {2018},
                              volume       = {13},
                              number       = {10},
                              pages        = {1--33},
                              url          = {https://doi.org/10.1371/journal.pone.0204685},
                              doi          = {http://doi.org/10.1371/journal.pone.0204685}
                            }
    			
    					
    Beißwanger, E. 2005 Modeling adult neurogenesis in the hippocampus Diploma thesis, Department of Mathematics and Computer Science, Ernst-Moritz-Arndt-University Greifswald, Germany.
     
    mastersthesis Adult neurogenesis: Dynamics I (2004,2005)
    Abstract: It is a distinctive feature of the hippocampus of the mammalian brain to generate new neurons throughout life. Neuronal progenitor cells pass through several steps of maturation until they reach adulthood and full functionality. In a mouse model six developmental stages have been defined which represent consecutive phases of adult neurogenesis. The early stages are highly proliferative, then the cells become postmitotic. Using the thymidine substitute BrdU as a marker for proliferating cells permits to detect the number of cells in every stage at consecutive times. The resulting test series indicate the dynamics of neuronal development. In the study we presented here we examine the reliability of the observed cell counts, discussing the experimental design, which has been used and especially attending the problems of the BrdU labeling method. Subsequently we establish a simple model for adult neurogenesis based on a system of linear ordinary differential equations. In a second approach we apply a discrete model, based on Leslie matrices. We analyze different scenarios of neuronal development to find out which one fits the data best and consequently might describe the real situation.
    BibTeX:
    			
                            @mastersthesis{Beisswanger-2005,
                              author       = {Elena Beißwanger},
                              title        = {Modeling adult neurogenesis in the hippocampus},
                              school       = {Department of Mathematics and Computer Science},
                              year         = {2005}
                            }
    			
    					
    Berkes, P. 2005 Pattern recognition with slow feature analysis Cognitive Sciences EPrint Archive (CogPrints), 4104.
     
    misc Handwritten digit recognition (2005)
    BibTeX:
    			
                            @misc{Berkes-2005a,
                              author       = {Pietro Berkes},
                              title        = {Pattern recognition with slow feature analysis},
                              year         = {2005},
                              volume       = {4104},
                              howpublished = {Cognitive Sciences EPrint Archive (CogPrints)},
                              url          = {http://cogprints.org/4104/}
                            }
    			
    					
    Berkes, P. 2005 Handwritten digit recognition with nonlinear Fisher Discriminant Analysis Proc. Intl. Conf. on Artificial Neur. Netw. (ICANN'05), Lecture Notes in Computer Science, 3696(2), 285-287.
     
    inproceedings Handwritten digit recognition (2005)
    BibTeX:
    			
                            @inproceedings{Berkes-2005b,
                              author       = {Berkes, Pietro},
                              title        = {Handwritten digit recognition with nonlinear {F}isher {D}iscriminant {A}nalysis},
                              booktitle    = {Proc. Intl.\ Conf.\ on Artificial Neur.\ Netw.\ (ICANN'05)},
                              publisher    = {Springer},
                              year         = {2005},
                              volume       = {3696},
                              number       = {2},
                              pages        = {285--287}
                            }
    			
    					
    Berkes, P. 2005 Temporal slowness as an unsupervised learning principle: self-organization of complex-cell receptive fields and application to pattern recognition PhD thesis, Institute for Biology , Humboldt University Berlin, D-10099 Berlin, Germany .
     
    phdthesis SFA: Complex cells (2001-2003), Analysis of quadratic forms (2002-2004), Handwritten digit recognition (2005)
    BibTeX:
    			
                            @phdthesis{Berkes-2005c,
                              author       = {Pietro Berkes},
                              title        = {Temporal slowness as an unsupervised learning principle: self-organization of complex-cell receptive fields and application to pattern recognition},
                              school       = {Institute for Biology},
                              year         = {2005}
                            }
    			
    					
    Berkes, P. & Wiskott, L. 2002 Applying Slow Feature Analysis to image sequences yields a rich repertoire of complex cell properties Proc. Intl. Conf. on Artificial Neural Networks (ICANN'02) , Lecture Notes in Computer Science , 81-86 .
     
    inproceedings SFA: Complex cells (2001-2003)
    BibTeX:
    			
                            @inproceedings{BerkesWiskott-2002,
                              author       = {Pietro Berkes and Laurenz Wiskott},
                              title        = {Applying {S}low {F}eature {A}nalysis to image sequences yields a rich repertoire of complex cell properties},
                              booktitle    = {Proc.\ Intl.\ Conf.\ on Artificial Neural Networks (ICANN'02)},
                              publisher    = {Springer},
                              year         = {2002},
                              pages        = {81--86},
                              url          = {https://link.springer.com/chapter/10.1007/3-540-46084-5_14},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/BerkesWiskott-2002-ProcICANN-SFAComplexCells-Preprint.pdf},
                              doi          = {http://doi.org/10.1007/3-540-46084-5_14}
                            }
    			
    					
    Berkes, P. & Wiskott, L. 2003 Slow feature analysis yields a rich repertoire of complex-cell properties Proc. 29th Göttingen Neurobiology Conference, Göttingen, Germany , 602-603 .
     
    inproceedings SFA: Complex cells (2001-2003)
    BibTeX:
    			
                            @inproceedings{BerkesWiskott-2003b,
                              author       = {Pietro Berkes and Laurenz Wiskott},
                              title        = {Slow feature analysis yields a rich repertoire of complex-cell properties},
                              booktitle    = {Proc.\ 29th Göttingen Neurobiology Conference, Göttingen, Germany},
                              publisher    = {Georg Thieme Verlag},
                              year         = {2003},
                              pages        = {602--603}
                            }
    			
    					
    Berkes, P. & Wiskott, L. 2003 Slow Feature Analysis yields a rich repertoire of complex-cell properties Cognitive Sciences EPrint Archive (CogPrints) , 2804 .
     
    misc SFA: Complex cells (2001-2003)
    BibTeX:
    			
                            @misc{BerkesWiskott-2003a,
                              author       = {Pietro Berkes and Laurenz Wiskott},
                              title        = {Slow {F}eature {A}nalysis yields a rich repertoire of complex-cell properties},
                              year         = {2003},
                              volume       = {2804},
                              howpublished = {Cognitive Sciences EPrint Archive (CogPrints)},
                              url          = {http://cogprints.org/2804/}
                            }
    			
    					
    Berkes, P. & Wiskott, L. 2004 Slow Feature Analysis yields a rich repertoire of complex-cell properties Proc. Early Cognitive Vision Workshop, May 28 - Jun 1, Isle of Skye, Scotland .
     
    inproceedings SFA: Complex cells (2001-2003)
    BibTeX:
    			
                            @inproceedings{BerkesWiskott-2004,
                              author       = {Pietro Berkes and Laurenz Wiskott},
                              title        = {Slow {F}eature {A}nalysis yields a rich repertoire of complex-cell properties},
                              booktitle    = {Proc.\ Early Cognitive Vision Workshop, May 28 -- Jun 1, Isle of Skye, Scotland},
                              year         = {2004}
                            }
    			
    					
    Berkes, P. & Wiskott, L. 2005 On the analysis and interpretation of inhomogeneous quadratic forms as receptive fields Cognitive Sciences EPrint Archive (CogPrints) , 4081 .
     
    misc Analysis of quadratic forms (2002-2004)
    BibTeX:
    			
                            @misc{BerkesWiskott-2005a,
                              author       = {Pietro Berkes and Laurenz Wiskott},
                              title        = {On the analysis and interpretation of inhomogeneous quadratic forms as receptive fields},
                              year         = {2005},
                              volume       = {4081},
                              howpublished = {Cognitive Sciences EPrint Archive (CogPrints)},
                              url          = {http://cogprints.org/4081/}
                            }
    			
    					
    Berkes, P. & Wiskott, L. 2005 Analysis of inhomogeneous quadratic forms for physiological and theoretical studies Proc. Computational and Systems Neuroscience (COSYNE'05), Salt Lake City, USA .
     
    inproceedings Analysis of quadratic forms (2002-2004)
    BibTeX:
    			
                            @inproceedings{BerkesWiskott-2005b,
                              author       = {Pietro Berkes and Laurenz Wiskott},
                              title        = {Analysis of inhomogeneous quadratic forms for physiological and theoretical studies},
                              booktitle    = {Proc.\ Computational and Systems Neuroscience (COSYNE'05), Salt Lake City, USA},
                              year         = {2005}
                            }
    			
    					
    Berkes, P. & Wiskott, L. 2005 Slow Feature Analysis yields a rich repertoire of complex cell properties Journal of Vision , 5(6) , 579-602 .
     
    article SFA: Complex cells (2001-2003)
    Abstract: In this study, we investigate temporal slowness as a learning principle for receptive fields using slow feature analysis, a new algorithm to determine functions that extract slowly varying signals from the input data. We find a good qualitative and quantitative match between the set of learned functions trained on image sequences and the population of complex cells in the primary visual cortex (V1). The functions show many properties found also experimentally in complex cells, such as direction selectivity, non-orthogonal inhibition, end-inhibition, and side-inhibition. Our results demonstrate that a single unsupervised learning principle can account for such a rich repertoire of receptive field properties.
    BibTeX:
    			
                            @article{BerkesWiskott-2005c,
                              author       = {Pietro Berkes and Laurenz Wiskott},
                              title        = {Slow {F}eature {A}nalysis yields a rich repertoire of complex cell properties},
                              journal      = {Journal of Vision},
                              year         = {2005},
                              volume       = {5},
                              number       = {6},
                              pages        = {579--602},
                              url          = {http://journalofvision.org/5/6/9/},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/BerkesWiskott-2005c-JoV-SFAComplexCells.pdf},
                              doi          = {http://doi.org/10.1167/5.6.9}
                            }
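    The abstract above describes slow feature analysis only verbally. As a rough illustration, linear SFA can be sketched in a few lines of numpy: whiten the input, then take the directions in which the temporal difference has the smallest variance. This is our own minimal sketch with toy data, not the authors' implementation (their experiments use nonlinear expansions of image sequences):

    ```python
    import numpy as np

    def linear_sfa(x, n_components=1):
        """Minimal linear Slow Feature Analysis sketch.

        x: array of shape (T, D), a multivariate time series.
        Returns the n_components slowest output signals, ordered by slowness.
        """
        x = x - x.mean(axis=0)                      # center
        # Whiten: decorrelate and normalize the input covariance.
        evals, evecs = np.linalg.eigh(np.cov(x, rowvar=False))
        z = (x @ evecs) / np.sqrt(evals)
        # The slowest directions minimize the variance of the temporal
        # difference: smallest eigenvectors of the difference covariance.
        dcov = np.cov(np.diff(z, axis=0), rowvar=False)
        w = np.linalg.eigh(dcov)[1][:, :n_components]
        return z @ w

    # Toy example: a slow sine hidden in a mixture with a faster one.
    t = np.linspace(0, 4 * np.pi, 2000)
    slow, fast = np.sin(t), np.sin(11 * t)
    mixed = np.stack([slow + 0.5 * fast, 0.5 * slow - fast], axis=1)
    y = linear_sfa(mixed, n_components=1)
    # The slowest extracted signal should correlate strongly with the slow source.
    corr = abs(np.corrcoef(y[:, 0], slow)[0, 1])
    ```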
    			
    					
    Berkes, P. & Wiskott, L. 2006 On the analysis and interpretation of inhomogeneous quadratic forms as receptive fields Neural Computation , 18(8) , 1868-1895 .
     
    article Analysis of quadratic forms (2002-2004)
    Abstract: In this letter, we introduce some mathematical and numerical tools to analyze and interpret inhomogeneous quadratic forms. The resulting characterization is in some aspects similar to that given by experimental studies of cortical cells, making it particularly suitable for application to second-order approximations and theoretical models of physiological receptive fields. We first discuss two ways of analyzing a quadratic form by visualizing the coefficients of its quadratic and linear term directly and by considering the eigenvectors of its quadratic term. We then present an algorithm to compute the optimal excitatory and inhibitory stimuli, i.e., those that maximize and minimize the considered quadratic form, respectively, given a fixed energy constraint. The analysis of the optimal stimuli is completed by considering their invariances, which are the transformations to which the quadratic form is most insensitive, and by introducing a test to determine which of these are statistically significant. Next we propose a way to measure the relative contribution of the quadratic and linear term to the total output of the quadratic form. Furthermore, we derive simpler versions of the above techniques in the special case of a quadratic form without linear term. In the final part of the letter, we show that for each quadratic form, it is possible to build an equivalent two-layer neural network, which is compatible with (but more general than) related networks used in some recent articles and with the energy model of complex cells. We show that the neural network is unique only up to an arbitrary orthogonal transformation of the excitatory and inhibitory subunits in the first layer.
    BibTeX:
    			
                            @article{BerkesWiskott-2006,
                              author       = {Pietro Berkes and Laurenz Wiskott},
                              title        = {On the analysis and interpretation of inhomogeneous quadratic forms as receptive fields},
                              journal      = {Neural Computation},
                              year         = {2006},
                              volume       = {18},
                              number       = {8},
                              pages        = {1868--1895},
                              url          = {http://dx.doi.org/10.1162/neco.2006.18.8.1868},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/BerkesWiskott-2006-NeurComp-QuadraticFormRFs.pdf},
                              doi          = {http://doi.org/10.1162/neco.2006.18.8.1868}
                            }
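    For the special case mentioned in the abstract, a quadratic form without linear term, the optimal excitatory and inhibitory stimuli under a fixed energy constraint reduce to the extreme eigenvectors of the quadratic term. A small illustrative sketch (function and variable names are ours, not the paper's code):

    ```python
    import numpy as np

    def optimal_stimuli(H, energy=1.0):
        """Optimal stimuli of the quadratic form q(x) = x.T @ H @ x under the
        energy constraint ||x||^2 = energy, in the special case without a
        linear term: the eigenvectors of H for its largest (excitatory) and
        smallest (inhibitory) eigenvalue, scaled to the allowed energy.
        """
        H = 0.5 * (H + H.T)                  # symmetrize, just in case
        evals, evecs = np.linalg.eigh(H)     # eigenvalues in ascending order
        r = np.sqrt(energy)
        x_exc = r * evecs[:, -1]             # maximizes q
        x_inh = r * evecs[:, 0]              # minimizes q
        return x_exc, x_inh

    # Toy quadratic form with one excitatory and one inhibitory direction.
    H = np.array([[2.0, 0.5],
                  [0.5, -1.0]])
    x_exc, x_inh = optimal_stimuli(H)
    q_exc, q_inh = x_exc @ H @ x_exc, x_inh @ H @ x_inh
    ```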
    			
    					
    Berkes, P. & Wiskott, L. 2007 Analysis and interpretation of quadratic models of receptive fields Nature Protocols , 2(2) , 400-407 .
     
    article Analysis of quadratic forms (2002-2004)
    Abstract: In this protocol, we present a procedure to analyze and visualize models of neuronal input-output functions that have a quadratic, a linear and a constant term, to determine their overall behavior. The suggested interpretations are close to those given by physiological studies of neurons, making the proposed methods particularly suitable for the analysis of receptive fields resulting from physiological measurements or model simulations.
    BibTeX:
    			
                            @article{BerkesWiskott-2007,
                              author       = {Berkes, P. and Wiskott, L.},
                              title        = {Analysis and interpretation of quadratic models of receptive fields},
                              journal      = {Nature Protocols},
                              year         = {2007},
                              volume       = {2},
                              number       = {2},
                              pages        = {400--407},
                              url          = {http://dx.doi.org/10.1038/nprot.2007.27},
                              doi          = {http://doi.org/10.1038/nprot.2007.27}
                            }
    			
    					
    Berkes, P. & Zito, T. 2005 Modular toolkit for Data Processing (MDP) Proc. Europython 2005, Jun 27-29, Gothenburg .
     
    inproceedings MDP: Modular toolkit for data processing (2003-now)
    BibTeX:
    			
                            @inproceedings{BerkesZito-2005,
                              author       = {P. Berkes and T. Zito},
                              title        = {Modular toolkit for {D}ata {P}rocessing ({MDP})},
                              booktitle    = {Proc.\ Europython 2005, Jun 27--29, Gothenburg},
                              year         = {2005}
                            }
    			
    					
    Berkes, P. & Zito, T. 2007 Modular toolkit for Data Processing (MDP version 2.1) http://mdp-toolkit.sourceforge.net/ .
    (first version 0.9 published Aug 2004)  
    misc MDP: Modular toolkit for data processing (2003-now)
    BibTeX:
    			
                            @misc{BerkesZito-2007,
                              author       = {Pietro Berkes and Tiziano Zito},
                              title        = {Modular toolkit for {D}ata {P}rocessing ({MDP} version 2.1)},
                              year         = {2007},
                              howpublished = {\url{http://mdp-toolkit.sourceforge.net/}}
                            }
    			
    					
    Blaschke, T. 2005 Independent Component Analysis and Slow Feature Analysis: relations and combination PhD thesis, Institute for Physics , Humboldt University Berlin, Germany .
     
    phdthesis Improved cumulant based ICA (2001,2002), SFA versus ICA (2002-2004), Independent slow feature analysis (ISFA) (2003-2005)
    Abstract: Within this thesis, we focus on the relation between independent component analysis (ICA) and slow feature analysis (SFA). To allow a comparison between both methods we introduce CuBICA2, an ICA algorithm based on second-order statistics only, i.e. cross-correlations. In contrast to algorithms based on higher-order statistics not only instantaneous cross-correlations but also time-delayed cross correlations are considered for minimization. CuBICA2 requires signal components with auto-correlation like in SFA, and has the ability to separate source signal components that have a Gaussian distribution. Furthermore, we derive an alternative formulation of the SFA objective function and compare it with that of CuBICA2. In the case of a linear mixture the two methods are equivalent if a single time delay is taken into account. The comparison can not be extended to the case of several time delays. For ICA a straightforward extension can be derived, but a similar extension to SFA yields an objective function that can not be interpreted in the sense of SFA. However, a useful extension in the sense of SFA to more than one time delay can be derived. This extended SFA reveals the close connection between the slowness objective of SFA and temporal predictability. Furthermore, we combine CuBICA2 and SFA. The result can be interpreted from two perspectives. From the ICA point of view the combination leads to an algorithm that solves the nonlinear blind source separation problem. From the SFA point of view the combination of ICA and SFA is an extension to SFA in terms of statistical independence. Standard SFA extracts slowly varying signal components that are uncorrelated meaning they are statistically independent up to second-order. The integration of ICA leads to signal components that are more or less statistically independent.
    BibTeX:
    			
                            @phdthesis{Blaschke-2005,
                              author       = {Tobias Blaschke},
                              title        = {Independent {C}omponent {A}nalysis and {S}low {F}eature {A}nalysis: relations and combination},
                              school       = {Institute for Physics, Humboldt University Berlin, Germany},
                              year         = {2005},
                              url          = {http://edoc.hu-berlin.de/docviews/abstract.php?lang=ger&id=25458},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/Blaschke-2005-Dissertation.pdf},
                              doi          = {http://doi.org/10.18452/15270}
                            }
    			
    					
    Blaschke, T.; Berkes, P. & Wiskott, L. 2006 What is the relationship between Slow Feature Analysis and Independent Component Analysis? Neural Computation , 18(10) , 2495-2508 .
     
    article SFA versus ICA (2002-2004)
    Abstract: We present an analytical comparison between linear slow feature analysis and second-order independent component analysis, and show that in the case of one time delay the two approaches are equivalent. We also consider the case of several time delays and discuss two possible extensions of slow feature analysis.
    BibTeX:
    			
                            @article{BlaschkeBerkesEtAl-2006,
                              author       = {T. Blaschke and P. Berkes and L. Wiskott},
                              title        = {What is the relationship between {S}low {F}eature {A}nalysis and {I}ndependent {C}omponent {A}nalysis?},
                              journal      = {Neural Computation},
                              year         = {2006},
                              volume       = {18},
                              number       = {10},
                              pages        = {2495--2508},
                              url          = {http://www.mitpressjournals.org/doi/abs/10.1162/neco.2006.18.10.2495},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/BlaschkeBerkesEtAl-2006-NeurComp-SFAvsICA.pdf},
                              doi          = {http://doi.org/10.1162/neco.2006.18.10.2495}
                            }
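    The equivalence stated in the abstract can be checked numerically: on whitened data, the SFA direction (smallest eigenvector of the difference covariance) coincides, up to sign, with the second-order ICA direction for one time delay (largest eigenvector of the symmetrized one-delay covariance). A toy demonstration of our own construction, not the paper's experiments:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Two sources with different temporal structure: moving averages of white
    # noise with different window lengths (different delayed autocorrelations).
    n = 5000
    s1 = np.convolve(rng.standard_normal(n + 50), np.ones(50) / 50, mode="valid")[:n]
    s2 = np.convolve(rng.standard_normal(n + 5), np.ones(5) / 5, mode="valid")[:n]
    x = np.stack([s1, s2], axis=1) @ np.array([[1.0, 0.6], [0.4, 1.0]]).T

    # Whiten the mixture.
    x = x - x.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(x, rowvar=False))
    z = (x @ evecs) / np.sqrt(evals)

    # SFA: smallest-eigenvalue eigenvector of the difference covariance.
    dcov = np.cov(np.diff(z, axis=0), rowvar=False)
    w_sfa = np.linalg.eigh(dcov)[1][:, 0]

    # Second-order ICA, one delay: largest-eigenvalue eigenvector of the
    # symmetrized one-delay covariance C1.
    c1 = z[:-1].T @ z[1:] / (n - 1)
    c1 = 0.5 * (c1 + c1.T)
    w_ica = np.linalg.eigh(c1)[1][:, -1]

    # Up to sign, the two directions coincide.
    agreement = abs(w_sfa @ w_ica)
    ```

    The agreement follows from the identity that, for whitened data, the difference covariance equals twice the identity minus twice the symmetrized one-delay covariance, so both eigenproblems share the same eigenvectors.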
    			
    					
    Blaschke, T. & Wiskott, L. 2002 An improved cumulant based method for Independent Component Analysis Proc. Intl. Conf. on Artificial Neural Networks (ICANN'02) , Lecture Notes in Computer Science , 1087-1093 .
     
    inproceedings Improved cumulant based ICA (2001,2002)
    BibTeX:
    			
                            @inproceedings{BlaschkeWiskott-2002,
                              author       = {Tobias Blaschke and Laurenz Wiskott},
                              title        = {An improved cumulant based method for {I}ndependent {C}omponent {A}nalysis},
                              booktitle    = {Proc.\ Intl.\ Conf.\ on Artificial Neural Networks (ICANN'02)},
                              publisher    = {Springer},
                              year         = {2002},
                              pages        = {1087--1093},
                              url          = {https://link.springer.com/chapter/10.1007/3-540-46084-5_176},
                              doi          = {http://doi.org/10.1007/3-540-46084-5_176}
                            }
    			
    					
    Blaschke, T. & Wiskott, L. 2005 Nonlinear blind source separation by integrating Independent Component Analysis and Slow Feature Analysis Proc. Advances in Neural Information Processing Systems 17 (NIPS'04) , 177-184 .
     
    inproceedings Independent slow feature analysis (ISFA) (2003-2005)
    BibTeX:
    			
                            @inproceedings{BlaschkeWiskott-2005,
                              author       = {T. Blaschke and L. Wiskott},
                              title        = {Nonlinear blind source separation by integrating {I}ndependent {C}omponent {A}nalysis and {S}low {F}eature {A}nalysis},
                              booktitle    = {Proc.\ Advances in Neural Information Processing Systems 17 (NIPS'04)},
                              publisher    = {The MIT Press},
                              year         = {2005},
                              pages        = {177--184}
                            }
    			
    					
    Blaschke, T. & Wiskott, L. 2003 CuBICA: Independent Component Analysis by simultaneous third- and fourth-order cumulant diagonalization Computer Science Preprint Server (CSPS): Computational Intelligence/0304002 .
     
    misc Improved cumulant based ICA (2001,2002)
    BibTeX:
    			
                            @misc{BlaschkeWiskott-2003,
                              author       = {Tobias Blaschke and Laurenz Wiskott},
                              title        = {{CuBICA}: {I}ndependent {C}omponent {A}nalysis by simultaneous third- and fourth-order cumulant diagonalization},
                              year         = {2003},
                              howpublished = {Computer Science Preprint Server (CSPS): Computational Intelligence/0304002}
                            }
    			
    					
    Blaschke, T. & Wiskott, L. 2004 CuBICA: Independent Component Analysis by simultaneous third- and fourth-order cumulant diagonalization IEEE Trans. on Signal Processing , 52(5) , 1250-1256 .
     
    article Improved cumulant based ICA (2001,2002)
    Abstract: CuBICA, an improved method for independent component analysis (ICA) based on the diagonalization of cumulant tensors is proposed. It is based on Comon's algorithm [Comon, 1994] but it takes third- and fourth-order cumulant tensors into account simultaneously. The underlying contrast function is also mathematically much simpler and has a more intuitive interpretation. It is therefore easier to optimize and approximate. A comparison with Comon's and three other ICA-algorithms on different data sets demonstrates its performance.
    BibTeX:
    			
                            @article{BlaschkeWiskott-2004a,
                              author       = {T. Blaschke and L. Wiskott},
                              title        = {{CuBICA}: {I}ndependent {C}omponent {A}nalysis by simultaneous third- and fourth-order cumulant diagonalization},
                              journal      = {IEEE Trans.\ on Signal Processing},
                              year         = {2004},
                              volume       = {52},
                              number       = {5},
                              pages        = {1250--1256},
                              url          = {http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=1284823},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/BlaschkeWiskott-2004a-IEEE-SP-CuBICA-Preprint.pdf},
                              doi          = {http://doi.org/10.1109/TSP.2004.826173}
                            }
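    CuBICA jointly diagonalizes third- and fourth-order cumulant tensors; as a simplified stand-in, the same kind of cumulant contrast (sum of squared third- and fourth-order auto-cumulants) can be maximized by brute force over rotation angles for a whitened two-source mixture. This is an illustrative sketch, not the published algorithm:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Two independent non-Gaussian sources, linearly mixed and then whitened.
    n = 20000
    s = np.stack([rng.uniform(-1, 1, n),          # negative kurtosis
                  rng.laplace(size=n)], axis=1)   # positive kurtosis
    x = s @ np.array([[1.0, 0.8], [0.3, 1.0]]).T
    x = x - x.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(x, rowvar=False))
    z = (x @ evecs) / np.sqrt(evals)

    def contrast(y):
        """Sum of squared third- and fourth-order auto-cumulants of the
        (zero-mean, unit-variance) outputs."""
        k3 = (y ** 3).mean(axis=0)
        k4 = (y ** 4).mean(axis=0) - 3.0
        return np.sum(k3 ** 2) + np.sum(k4 ** 2)

    def rotate(z, a):
        return z @ np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])

    # After whitening only a rotation remains; search it by brute force.
    angles = np.linspace(0, np.pi / 2, 500)
    best = max(angles, key=lambda a: contrast(rotate(z, a)))
    y = rotate(z, best)
    ```

    Brute-force search only works in two dimensions; the point is that maximizing the cumulant contrast over rotations recovers the sources up to permutation and sign.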
    			
    					
    Blaschke, T. & Wiskott, L. 2004 Independent Slow Feature Analysis and nonlinear blind source separation Proc. of the 5th Int. Conf. on Independent Component Analysis and Blind Signal Separation (ICA'04), Granada, Spain , Lecture Notes in Computer Science .
     
    inproceedings Independent slow feature analysis (ISFA) (2003-2005)
    BibTeX:
    			
                            @inproceedings{BlaschkeWiskott-2004b,
                              author       = {T. Blaschke and L. Wiskott},
                              title        = {Independent {S}low {F}eature {A}nalysis and nonlinear blind source separation},
                              booktitle    = {Proc. of the 5th Int. Conf. on Independent Component Analysis and Blind Signal Separation (ICA'04), Granada, Spain},
                              publisher    = {Springer},
                              year         = {2004},
                              url          = {https://link.springer.com/chapter/10.1007/978-3-540-30110-3_94},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/BlaschkeWiskott-2004b-ProcICA-ISFA-Preprint.pdf},
                              doi          = {http://doi.org/10.1007/978-3-540-30110-3_94}
                            }
    			
    					
    Blaschke, T.; Zito, T. & Wiskott, L. 2007 Independent Slow Feature Analysis and nonlinear blind source separation Neural Computation , 19(4) , 994-1021 .
     
    article Independent slow feature analysis (ISFA) (2003-2005)
    Abstract: In the linear case statistical independence is a sufficient criterion for performing blind source separation. In the nonlinear case, however, it leaves an ambiguity in the solutions that has to be resolved by additional criteria. Here we argue that temporal slowness complements statistical independence well and that a combination of the two leads to unique solutions of the nonlinear blind source separation problem. The algorithm we present is a combination of second-order Independent Component Analysis and Slow Feature Analysis and is referred to as Independent Slow Feature Analysis. Its performance is demonstrated on nonlinearly mixed music data. We conclude that slowness is indeed a useful complement to statistical independence but that time-delayed second-order moments are only a weak measure of statistical independence.
    BibTeX:
    			
                            @article{BlaschkeZitoEtAl-2007,
                              author       = {Tobias Blaschke and Tiziano Zito and Laurenz Wiskott},
                              title        = {Independent {S}low {F}eature {A}nalysis and nonlinear blind source separation},
                              journal      = {Neural Computation},
                              year         = {2007},
                              volume       = {19},
                              number       = {4},
                              pages        = {994--1021},
                              url          = {http://neco.mitpress.org/cgi/content/abstract/19/4/994},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/BlaschkeZitoEtAl-2007-NeurComp-ISFA.pdf},
                              doi          = {http://doi.org/10.1162/neco.2007.19.4.994}
                            }
    			
    					
    Bollenbacher, J.; Soulier, F.; Rhein, B. & Wiskott, L. 2020 Investigating Parallelization of MAML Discovery Science , 294-306 .
     
    inproceedings
    Abstract: We propose a meta-learning framework to distribute Model-Agnostic Meta-Learning (DMAML), a widely used meta-learning algorithm, over multiple workers running in parallel. DMAML enables us to use multiple servers for learning and might be crucial if we want to tackle more challenging problems that often require more CPU time for simulation. In this work, we apply distributed MAML on supervised regression and image recognition tasks, which are quasi benchmark tasks in the field of meta-learning. We show the impact of parallelization w.r.t. wall clock time. Therefore, we compare distributing MAML over multiple workers and merging the model parameters after parallel learning with parallelizing MAML itself. We also investigate the impact of the hyperparameters on learning and point out further potential improvements.
    BibTeX:
    			
                            @inproceedings{BollenbacherSoulierEtAl-2020,
                              author       = {Bollenbacher, Jan and Soulier, Florian and Rhein, Beate and Wiskott, Laurenz},
                              title        = {Investigating Parallelization of MAML},
                              booktitle    = {Discovery Science},
                              publisher    = {Springer International Publishing},
                              year         = {2020},
                              pages        = {294--306}
                            }
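    One of the two parallelization schemes compared in the abstract merges worker model parameters after parallel learning. The merging step might be sketched as a plain per-parameter average; this is an assumption on our part, and the paper may combine workers differently:

    ```python
    import numpy as np

    def merge_worker_params(worker_params):
        """Average model parameters from workers trained in parallel.

        worker_params: list of parameter lists, one list of arrays per worker,
        all with matching shapes. Returns the element-wise mean per parameter.
        """
        return [np.mean(np.stack(p), axis=0) for p in zip(*worker_params)]

    # Three hypothetical workers, each holding two parameter arrays.
    workers = [[np.full(2, w), np.full(3, 10.0 * w)] for w in (1.0, 2.0, 3.0)]
    merged = merge_worker_params(workers)
    ```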
    			
    					
    Butz, M. & van Ooyen, A. 2013 A simple rule for dendritic spine and axonal bouton formation can account for cortical reorganization after focal retinal lesions PLoS Comput. Biol. , 9(10) , e1003259 .
     
    article N.N.
    Abstract: Lasting alterations in sensory input trigger massive structural and functional adaptations in cortical networks. The principles governing these experience-dependent changes are, however, poorly understood. Here, we examine whether a simple rule based on the neurons' need for homeostasis in electrical activity may serve as driving force for cortical reorganization. According to this rule, a neuron creates new spines and boutons when its level of electrical activity is below a homeostatic set-point and decreases the number of spines and boutons when its activity exceeds this set-point. In addition, neurons need a minimum level of activity to form spines and boutons. Spine and bouton formation depends solely on the neuron's own activity level, and synapses are formed by merging spines and boutons independently of activity. Using a novel computational model, we show that this simple growth rule produces neuron and network changes as observed in the visual cortex after focal retinal lesions. In the model, as in the cortex, the turnover of dendritic spines was increased strongest in the center of the lesion projection zone, while axonal boutons displayed a marked overshoot followed by pruning. Moreover, the decrease in external input was compensated for by the formation of new horizontal connections, which caused a retinotopic remapping. Homeostatic regulation may provide a unifying framework for understanding cortical reorganization, including network repair in degenerative diseases or following focal stroke.
    BibTeX:
    			
                            @article{ButzOoyen-2013,
                              author       = {Butz, M. and van Ooyen, A.},
                              title        = {{A} simple rule for dendritic spine and axonal bouton formation can account for cortical reorganization after focal retinal lesions},
                              journal      = {PLoS Comput. Biol.},
                              year         = {2013},
                              volume       = {9},
                              number       = {10},
                              pages        = {e1003259},
                              url          = {http://www.ploscompbiol.org/article/info:doi/10.1371/journal.pcbi.1003259},
                              doi          = {http://doi.org/10.1371/journal.pcbi.1003259}
                            }
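    The growth rule summarized in the abstract can be caricatured in a few lines: a neuron adds synaptic elements when its activity is below the homeostatic set-point, removes them when above, and forms no new elements below a minimum activity level. Parameter names and the linear update are illustrative, not the published model:

    ```python
    import numpy as np

    def update_elements(elements, activity, set_point=1.0, min_activity=0.1, rate=0.1):
        """One homeostatic update step for a population of neurons.

        elements: current counts of synaptic elements (spines/boutons).
        activity: current activity level of each neuron.
        Growth is driven by the distance to the set-point, gated by a
        minimum activity requirement; counts cannot become negative.
        """
        drive = rate * (set_point - activity)               # grow if below set-point
        drive = np.where((activity < min_activity) & (drive > 0), 0.0, drive)
        return np.maximum(elements + drive, 0.0)

    activity = np.array([0.05, 0.5, 1.0, 1.5])              # four example neurons
    elements = update_elements(np.full(4, 2.0), activity)
    # Neuron 0 is below the minimum activity and forms nothing; neuron 1 grows;
    # neuron 2 is at the set-point; neuron 3 is above it and prunes.
    ```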
    			
    					
    Butz, M.; Steenbuck, I.D. & van Ooyen, A. 2014 Homeostatic structural plasticity increases the efficiency of small-world networks Frontiers in Synaptic Neuroscience , 6(7) .
     
    article N.N.
    Abstract: In networks with small-world topology, which are characterized by a high clustering coefficient and a short characteristic path length, information can be transmitted efficiently and at relatively low costs. The brain is composed of small-world networks, and evolution may have optimized brain connectivity for efficient information processing. Despite many studies on the impact of topology on information processing in neuronal networks, little is known about the development of network topology and the emergence of efficient small-world networks. We investigated how a simple growth process that favors short-range connections over long-range connections in combination with a synapse formation rule that generates homeostasis in post-synaptic firing rates shapes neuronal network topology. Interestingly, we found that small-world networks benefited from homeostasis by an increase in efficiency, defined as the averaged inverse of the shortest paths through the network. Efficiency particularly increased as small-world networks approached the desired level of electrical activity. Ultimately, homeostatic small-world networks became almost as efficient as random networks. The increase in efficiency was caused by the emergent property of the homeostatic growth process that neurons started forming more long-range connections, albeit at a low rate, when their electrical activity was close to the homeostatic set-point. Although global network topology continued to change when neuronal activities were around the homeostatic equilibrium, the small-world property of the network was maintained over the entire course of development. Our results may help understand how complex systems such as the brain could set up an efficient network topology in a self-organizing manner. Insights from our work may also lead to novel techniques for constructing large-scale neuronal networks by self-organization.
    BibTeX:
    			
                            @article{ButzSteenbuckEtAl-2014,
                              author       = {Butz, Markus and Steenbuck, Ines Derya and van Ooyen, Arjen},
                              title        = {Homeostatic structural plasticity increases the efficiency of small-world networks},
                              journal      = {Frontiers in Synaptic Neuroscience},
                              year         = {2014},
                              volume       = {6},
                              number       = {7},
    			  
    			  url          = {http://www.frontiersin.org/synaptic_neuroscience/10.3389/fnsyn.2014.00007/abstract},
    			  
                              doi          = {http://doi.org/10.3389/fnsyn.2014.00007}
                            }
    			
    					
    Creutzig, F. & Sprekeler, H. 2008 Predictive coding and the slowness principle: an information-theoretic approach Neural Computation , 20(4) , 1026-1041 .
     
    article N.N.
    Abstract: Understanding the guiding principles of sensory coding strategies is a main goal in computational neuroscience. Among others, the principles of predictive coding and slowness appear to capture aspects of sensory processing. Predictive coding postulates that sensory systems are adapted to the structure of their input signals such that information about future inputs is encoded. Slow feature analysis (SFA) is a method for extracting slowly varying components from quickly varying input signals, thereby learning temporally invariant features. Here, we use the information bottleneck method to state an information-theoretic objective function for temporally local predictive coding. We then show that the linear case of SFA can be interpreted as a variant of predictive coding that maximizes the mutual information between the current output of the system and the input signal in the next time step. This demonstrates that the slowness principle and predictive coding are intimately related.
    BibTeX:
    			
                            @article{CreutzigSprekeler-2008,
                              author       = {Creutzig, Felix and Sprekeler, Henning},
                              title        = {Predictive coding and the slowness principle: an information-theoretic approach},
                              journal      = {Neural Computation},
                              year         = {2008},
                              volume       = {20},
                              number       = {4},
                              pages        = {1026--1041},
    			  
    			  url          = {http://dx.doi.org/10.1162/neco.2008.01-07-455},
    			  
                              doi          = {http://doi.org/10.1162/neco.2008.01-07-455}
                            }
    			
    					
    Dähne, S. 2010 Self-organization of V1 complex-cells based on Slow Feature Analysis and retinal waves Bernstein Center for Computational Neuroscience, Berlin Institute of Technology .
     
    mastersthesis SFA: Prenatal complex cells (2008-2010,2013)
    BibTeX:
    			
                            @mastersthesis{Daehne-2010,
                              author       = {Dähne, Sven},
                              title        = {Self-organization of {V1} complex-cells based on {S}low {F}eature {A}nalysis and retinal waves},
                              school       = {Bernstein Center for Computational Neuroscience, Berlin Institute of Technology},
                              year         = {2010},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/Dahne-2010-MasterThesis-SFARetinalWaves.pdf}
                            }
    			
    					
    Dähne, S.; Wilbert, N. & Wiskott, L. 2010 Learning complex cell units from simulated prenatal retinal waves using Slow Feature Analysis Interdisciplinary College 2010 .
     
    inproceedings SFA: Prenatal complex cells (2008-2010,2013)
    BibTeX:
    			
                            @inproceedings{DaehneWilbertEtAl-2010b,
                              author       = {Dähne, Sven and Wilbert, Niko and Wiskott, Laurenz},
                              title        = {Learning complex cell units from simulated prenatal retinal waves using {S}low {F}eature {A}nalysis},
                              booktitle    = {Interdisciplinary College 2010},
                              year         = {2010}
                            }
    			
    					
    Dähne, S.; Wilbert, N. & Wiskott, L. 2009 Learning complex cell units from simulated prenatal retinal waves with Slow Feature Analysis Proc. 18th Annual Computational Neuroscience Meeting (CNS'09), Jul 18-23, Berlin, Germany .
    (Special issue of BMC Neuroscience 10(Suppl 1):P129)  
    inproceedings SFA: Prenatal complex cells (2008-2010,2013)
    BibTeX:
    			
                            @inproceedings{DaehneWilbertEtAl-2009a,
                              author       = {Sven Dähne and Niko Wilbert and Laurenz Wiskott},
                              title        = {Learning complex cell units from simulated prenatal retinal waves with {S}low {F}eature {A}nalysis},
                              booktitle    = {Proc.\ 18th Annual Computational Neuroscience Meeting (CNS'09), Jul 18--23, Berlin, Germany},
                              year         = {2009},
    			  
    			  url          = {http://www.biomedcentral.com/1471-2202/10/S1/P129},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/DahneWilbertEtAl-2009a-ProcCNSBerlin-Abstract-SFARetinalWaves.pdf},
                              url3         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/DahneWilbertEtAl-2009a-ProcCNSBerlin-Poster-SFARetinalWaves.pdf},
    			  
                              doi          = {http://doi.org/10.1186/1471-2202-10-S1-P129}
                            }
    			
    					
    Dähne, S.; Wilbert, N. & Wiskott, L. 2009 Learning complex cell units from simulated prenatal retinal waves with Slow Feature Analysis Proc. 6th International PhD Symposium Berlin Brain Days, Dec 9-11, Berlin, Germany .
     
    inproceedings SFA: Prenatal complex cells (2008-2010,2013)
    BibTeX:
    			
                            @inproceedings{DaehneWilbertEtAl-2009b,
                              author       = {Sven Dähne and Niko Wilbert and Laurenz Wiskott},
                              title        = {Learning complex cell units from simulated prenatal retinal waves with {S}low {F}eature {A}nalysis},
                              booktitle    = {Proc.\ 6th International PhD Symposium Berlin Brain Days, Dec 9--11, Berlin, Germany},
                              year         = {2009}
                            }
    			
    					
    Dähne, S.; Wilbert, N. & Wiskott, L. 2010 Self-organization of V1 complex cells based on Slow Feature Analysis and retinal waves Proc. Bernstein Conference on Computational Neuroscience, Sep 27-Oct 1, Berlin, Germany .
     
    inproceedings SFA: Prenatal complex cells (2008-2010,2013)
    BibTeX:
    			
                            @inproceedings{DaehneWilbertEtAl-2010a,
                              author       = {S. Dähne and N. Wilbert and L. Wiskott},
                              title        = {Self-organization of {V1} complex cells based on {S}low {F}eature {A}nalysis and retinal waves},
                              booktitle    = {Proc.\ Bernstein Conference on Computational Neuroscience, Sep 27--Oct 1, Berlin, Germany},
                              year         = {2010},
    			  
    			  url          = {http://www.frontiersin.org/10.3389/conf.fncom.2010.51.00090/event_abstract},
    			  
                              doi          = {http://doi.org/10.3389/conf.fncom.2010.51.00090}
                            }
    			
    					
    Dähne, S.; Wilbert, N. & Wiskott, L. 2014 Slow Feature Analysis on retinal waves leads to V1 complex cells PLoS Comput Biol , 10(5) , e1003564 .
     
    article SFA: Prenatal complex cells (2008-2010,2013)
    Abstract: The developing visual system of many mammalian species is partially structured and organized even before the onset of vision. Spontaneous neural activity, which spreads in waves across the retina, has been suggested to play a major role in these prenatal structuring processes. Recently, it has been shown that when employing an efficient coding strategy, such as sparse coding, these retinal activity patterns lead to basis functions that resemble optimal stimuli of simple cells in primary visual cortex (V1). Here we present the results of applying a coding strategy that optimizes for temporal slowness, namely Slow Feature Analysis (SFA), to a biologically plausible model of retinal waves. Previously, SFA has been successfully applied to model parts of the visual system, most notably in reproducing a rich set of complex-cell features by training SFA with quasi-natural image sequences. In the present work, we obtain SFA units that share a number of properties with cortical complex-cells by training on simulated retinal waves. The emergence of two distinct properties of the SFA units (phase invariance and orientation tuning) is thoroughly investigated via control experiments and mathematical analysis of the input-output functions found by SFA. The results support the idea that retinal waves share relevant temporal and spatial properties with natural visual input. Hence, retinal waves seem suitable training stimuli to learn invariances and thereby shape the developing early visual system such that it is best prepared for coding input from the natural world.
    BibTeX:
    			
                            @article{DaehneWilbertEtAl-2014,
                              author       = {Sven Dähne and Niko Wilbert and Laurenz Wiskott},
                              title        = {Slow {F}eature {A}nalysis on retinal waves leads to {V1} complex cells},
                              journal      = {PLoS Comput Biol},
                              publisher    = {Public Library of Science},
                              year         = {2014},
                              volume       = {10},
                              number       = {5},
                              pages        = {e1003564},
    			  
    			  url          = {http://dx.doi.org/10.1371/journal.pcbi.1003564},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/DahneWilbertEtAl-2014-PLoSCompBiol-RetinalWaves.pdf},
    			  
                              doi          = {http://doi.org/10.1371/journal.pcbi.1003564}
                            }
    			
    					
    Doursat, R.; Konen, W.; Lades, M.; von der Malsburg, C.; Vorbrüggen, J.; Wiskott, L. & Würtz, R.P. 1993 Neural mechanisms of elastic pattern matching Technical report , IR-INI 93-01 .
     
    techreport Scene analysis (1992)
    BibTeX:
    			
                            @techreport{DoursatKonenEtAl-1993,
                              author       = {Ren{\'e} Doursat and Wolfgang Konen and Martin Lades and Christoph von der Malsburg and Jan Vorbrüggen and Laurenz Wiskott and Rolf P. Würtz},
                              title        = {Neural mechanisms of elastic pattern matching},
                              institution  = {Institut für Neuroinformatik},
                              year         = {1993},
                              number       = {IR-INI 93-01},
                              type         = {Technical report}
                            }
    			
    					
    Draht, F.; Zhang, S.; Rayan, A.; Schönfeld, F.; Wiskott, L. & Manahan-Vaughan, D. 2017 Experience-dependency of reliance on local visual and idiothetic cues for spatial representations created in the absence of distal information Frontiers in Behavioral Neuroscience , 11 , 92 .
     
    article N.N.
    Abstract: Spatial encoding in the hippocampus is based on a range of different input sources. To generate spatial representations, reliable sensory cues from the external environment are integrated with idiothetic cues, derived from self-movement, that enable path integration and directional perception. In this study, we examined to what extent idiothetic cues significantly contribute to spatial representations and navigation: we recorded place cells while rodents navigated towards two visually identical chambers in 180° orientation via two different paths in darkness and in the absence of reliable auditory or olfactory cues. Our goal was to generate a conflict between local visual and direction-specific information, and then to assess which strategy was prioritized in different learning phases. We observed that, in the absence of distal cues, place fields are initially controlled by local visual cues that override idiothetic cues, but that with multiple exposures to the paradigm, spaced at intervals of days, idiothetic cues become increasingly implemented in generating an accurate spatial representation. Taken together, these data support that, in the absence of distal cues, local visual cues are prioritized in the generation of context-specific spatial representations through place cells, whereby idiothetic cues are deemed unreliable. With cumulative exposures to the environments, the animal learns to attend to subtle idiothetic cues to resolve the conflict between visual and direction-specific information.
    BibTeX:
    			
                            @article{DrahtZhangEtAl-2017,
                              author       = {Fabian Draht and Sijie Zhang and Abdelrahman Rayan and Fabian Schönfeld and Laurenz Wiskott and Denise Manahan-Vaughan},
                              title        = {Experience-dependency of reliance on local visual and idiothetic cues for spatial representations created in the absence of distal information},
                              journal      = {Frontiers in Behavioral Neuroscience},
                              year         = {2017},
                              volume       = {11},
                              pages        = {92},
    			  
    			  url          = {http://journal.frontiersin.org/article/10.3389/fnbeh.2017.00092/full},
    			  
                              doi          = {http://doi.org/10.3389/fnbeh.2017.00092}
                            }
    			
    					
    Engelhardt, R.C.; Lange, M.; Wiskott, L. & Konen, W. 2023 Sample-Based Rule Extraction for Explainable Reinforcement Learning Proc. 9th International Conference on Machine Learning, Optimization, and Data Science (LOD) , Lecture Notes in Computer Science , 13810 , 330-345 .
     
    inproceedings
    Abstract: In this paper we propose a novel, phenomenological approach to explainable Reinforcement Learning (RL). While the ever-increasing performance of RL agents surpasses human capabilities on many problems, it falls short concerning explainability, which might be of minor importance when solving toy problems but is certainly a major obstacle for the application of RL in industrial and safety-critical processes. The literature contains different approaches to increase explainability of deep artificial networks. However, to our knowledge there is no simple, agent-agnostic method to extract human-readable rules from trained RL agents. Our approach is based on the idea of observing the agent and its environment during evaluation episodes and inducing a decision tree from the collected samples, obtaining an explainable mapping of the environment's state to the agent's corresponding action. We tested our idea on classical control problems provided by OpenAI Gym using handcrafted rules as a benchmark as well as trained deep RL agents with two different algorithms for decision tree induction. The extracted rules demonstrate how this new approach might be a valuable step towards the goal of explainable RL.
    BibTeX:
    			
                            @inproceedings{EngelhardtLangeEtAl-2023,
                              author       = {Engelhardt, Raphael C. and Lange, Moritz and Wiskott, Laurenz and Konen, Wolfgang},
                              title        = {Sample-Based Rule Extraction for Explainable Reinforcement Learning},
                              booktitle    = {Proc. 9th International Conference on Machine Learning, Optimization, and Data Science (LOD)},
                              publisher    = {Springer Nature Switzerland},
                              year         = {2023},
                              volume       = {13810},
                              pages        = {330--345},
    			  
                              doi          = {http://doi.org/10.1007/978-3-031-25599-1_25}
                            }
    			
    					
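The sample-based idea described in the abstract above can be illustrated with a small sketch (all names and the toy policy are illustrative, not from the paper; a depth-1 "decision stump" stands in for a full decision-tree learner): observe the trained agent on sampled states, then induce a single human-readable threshold rule that mimics its actions.

```python
import numpy as np

def induce_stump(states, actions):
    """Find the (feature, threshold) split that best mimics the observed actions.

    Returns (error rate, feature index, threshold); a flipped rule is allowed.
    """
    best = None
    for f in range(states.shape[1]):
        for t in np.unique(states[:, f]):
            pred = (states[:, f] > t).astype(int)
            err = np.mean(pred != actions)
            err = min(err, 1 - err)  # rule may fire in either direction
            if best is None or err < best[0]:
                best = (err, f, t)
    return best

# toy CartPole-like "oracle" agent: push right iff the pole angle is positive
rng = np.random.default_rng(0)
states = rng.uniform(-1, 1, size=(300, 2))   # columns: [angle, velocity]
actions = (states[:, 0] > 0.0).astype(int)   # the trained policy we observe
err, feat, thr = induce_stump(states, actions)
```

On this toy agent the induced rule recovers the relevant feature (the angle, index 0) with zero disagreement; real agents and deeper trees follow the same collect-samples-then-fit scheme.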
    Engelhardt, R.C.; Oedingen, M.; Lange, M.; Wiskott, L. & Konen, W. 2023 Iterative Oblique Decision Trees Deliver Explainable RL Models Algorithms , 16(6) .
     
    article
    Abstract: The demand for explainable and transparent models increases with the continued success of reinforcement learning. In this article, we explore the potential of generating shallow decision trees (DTs) as simple and transparent surrogate models for opaque deep reinforcement learning (DRL) agents. We investigate three algorithms for generating training data for axis-parallel and oblique DTs with the help of DRL agents (“oracles”) and evaluate these methods on classic control problems from OpenAI Gym. The results show that one of our newly developed algorithms, the iterative training, outperforms traditional sampling algorithms, resulting in well-performing DTs that often even surpass the oracle from which they were trained. Even higher dimensional problems can be solved with surprisingly shallow DTs. We discuss the advantages and disadvantages of different sampling methods and insights into the decision-making process made possible by the transparent nature of DTs. Our work contributes to the development of not only powerful but also explainable RL agents and highlights the potential of DTs as a simple and effective alternative to complex DRL models.
    BibTeX:
    			
                            @article{EngelhardtOedingenEtAl-2023,
                              author       = {Engelhardt, Raphael C. and Oedingen, Marc and Lange, Moritz and Wiskott, Laurenz and Konen, Wolfgang},
                              title        = {Iterative Oblique Decision Trees Deliver Explainable RL Models},
                              journal      = {Algorithms},
                              year         = {2023},
                              volume       = {16},
                              number       = {6},
    			  
    			  url          = {https://www.mdpi.com/1999-4893/16/6/282},
    			  
                              doi          = {http://doi.org/10.3390/a16060282}
                            }
    			
    					
    Escalante, A. & Wiskott, L. 2010 Gender and age estimation from synthetic face images with hierarchical Slow Feature Analysis International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU'10), Jun 28 - Jul 2, Dortmund .
     
    inproceedings Supervised Learning with GSFA (2009-2017)
    Abstract: Our ability to recognize the gender and estimate the age of people around us is crucial for our social development and interactions. In this paper, we investigate how to use Slow Feature Analysis (SFA) to estimate gender and age from synthetic face images. SFA is a versatile unsupervised learning algorithm that extracts slowly varying features from a multidimensional signal. To process very high-dimensional data, such as images, SFA can be applied hierarchically. The key idea here is to construct the training signal such that the parameters of interest, namely gender and age, vary slowly. This makes the labelling of the data implicit in the training signal and permits the use of the unsupervised algorithm in a hierarchical fashion. A simple supervised step at the very end is then sufficient to extract gender and age with high reliability. Gender was estimated with a very high accuracy, and age had an RMSE of 3.8 years for test images.
    BibTeX:
    			
                            @inproceedings{EscalanteWiskott-2010,
                              author       = {Alberto Escalante and Laurenz Wiskott},
                              title        = {Gender and age estimation from synthetic face images with hierarchical {S}low {F}eature {A}nalysis},
                              booktitle    = {International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU'10), Jun 28 -- Jul 2, Dortmund},
                              year         = {2010},
    			  
    			  url          = {http://www.springerlink.com/content/r031104qv7228r35},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/EscalanteWiskott-2010-IPMU-AgeGenderEstimation-Preprint.pdf},
    			  
                              doi          = {http://doi.org/10.1007/978-3-642-14049-5_25}
                            }
    			
    					
    Escalante, A. & Wiskott, L. 2011 Heuristic evaluation of expansions for non-linear hierarchical Slow Feature Analysis Proc. The 10th Intl. Conf. on Machine Learning and Applications (ICMLA'11), Dec 18-21, Honolulu, Hawaii , 133-138 .
     
    inproceedings N.N.
    Abstract: Slow Feature Analysis (SFA) is a feature extraction algorithm based on the slowness principle with applications to both supervised and unsupervised learning. When implemented hierarchically, it allows for efficient processing of high-dimensional data, such as images. Expansion plays a crucial role in the implementation of non-linear SFA. In this paper, a fast heuristic method for the evaluation of expansions is proposed, consisting of tests on seven problems and two metrics. Several expansions with different complexities are evaluated. It is shown that the method allows predictions of the performance of SFA on a concrete data set, and the use of normalized expansions is justified. The proposed method is useful for the design of powerful expansions that allow the extraction of complex high-level features and provide better generalization.
    BibTeX:
    			
                            @inproceedings{EscalanteWiskott-2011,
                              author       = {Alberto Escalante and Laurenz Wiskott},
                              title        = {Heuristic evaluation of expansions for non-linear hierarchical {S}low {F}eature {A}nalysis},
                              booktitle    = {Proc.\ The 10th Intl.\ Conf.\ on Machine Learning and Applications (ICMLA'11), Dec 18--21, Honolulu, Hawaii},
                              publisher    = {IEEE Computer Society},
                              year         = {2011},
                              pages        = {133--138},
    			  
    			  url          = {http://ieeexplore.ieee.org/document/6146957/},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/EscalanteWiskott-2011-ICMLA-Expansions-Preprint.pdf},
    			  
                              doi          = {http://doi.org/10.1109/ICMLA.2011.72}
                            }
    			
    					
    Escalante-B., A.N. 2017 Extensions of Hierarchical Slow Feature Analysis for efficient classification and regression on high-dimensional data Ruhr University Bochum, Faculty of Electrical Engineering and Information Technology .
     
    phdthesis Supervised Learning with GSFA (2009-2017)
    Abstract: This thesis develops new extensions to slow feature analysis (SFA) that solve supervised learning problems (e.g., classification and regression) on high-dimensional data (e.g., images) in an efficient, accurate, and principled way. This type of problems has been addressed by convolutional neural networks (CNNs) in the last decade with excellent results. However, additional approaches would be valuable, specially those that are conceptually novel and whose design can be justified theoretically. SFA is an algorithm originally designed for unsupervised learning that extracts slow (i.e., temporally stable) features. Advantages of SFA include a strong theoretical foundation and that it might be intimately connected to learning in biological systems. One can apply SFA to high-dimensional data if it is implemented hierarchically, a technique called hierarchical SFA (HSFA). The extensions to SFA listed in the following allow the construction and training of deep HSFA networks, yielding competitive accuracy and efficiency. Graph-based SFA (GSFA) is a supervised extension to SFA that introduces the concept of training graph, a structure in which the vertices are samples (e.g., images) and edges represent transitions between pairs of samples. Edges have weights that can be interpreted as desired output similarities of the corresponding samples. Compared to SFA, GSFA solves a more general optimization problem and considers many more transitions. Information about label (or class) similarities is encoded in the graph by the strength of the edge weights. Many training graphs are proposed to handle regression and classification problems. The efficacy of GSFA is demonstrated on a subproblem of face detection. The exact label learning (ELL) method allows to compute training graphs where the slowest feature(s) one could extract would be equal to the label(s), if the feature space were unrestricted. In contrast to previously proposed graphs, the edge weights of the resulting ELL graphs are set precisely as needed, improving the label estimation accuracy. Moreover, ELL allows to learn multiple labels simultaneously using a single network, which is more efficient than learning the labels separately and often results in more robust features. Hierarchical information-preserving GSFA (HiGSFA) improves the amount of label information propagated from the input to the top node in hierarchical GSFA (HGSFA). HiGSFA computes two types of features: slow features that maximize slowness, as usual, and reconstructive features that minimize an input reconstruction error, following an information-preservation goal. HiGSFA is evaluated on the problem of age estimation (along with gender and race) from facial photographs, where it yields a mean average error of 3.50 years, outperforming current state-of-the-art systems. Among the proposed extensions, HiGSFA is the most promising. HiGSFA incorporates the other extensions and yields the best results, making this approach competitive, scalable, and robust. Moreover, HiGSFA is a versatile algorithm, allowing new technical applications and further principled extensions.
    BibTeX:
    			
                            @phdthesis{Escalante-B.-2017,
                              author       = {Alberto N. Escalante-B.},
                              title        = {Extensions of {H}ierarchical {S}low {F}eature {A}nalysis for efficient classification and regression on high-dimensional data},
                              school       = {Ruhr University Bochum, Faculty of Electrical Engineering and Information Technology},
                              year         = {2017},
    			  
    			  url          = {http://hss-opus.ub.ruhr-uni-bochum.de/opus4/frontdoor/index/index/docId/5388}
                            }
    			
    					
    Escalante-B., A.N. & Wiskott, L. 2012 Slow Feature Analysis: perspectives for technical applications of a versatile learning algorithm Künstliche Intelligenz [Artificial Intelligence] , 26(4) , 341-348 .
     
    article SFA: Estimating driving forces (2000-2003), Extended slow feature analysis (xSFA) (2006-2013), Supervised Learning with GSFA (2009-2017)
    Abstract: Slow Feature Analysis (SFA) is an unsupervised learning algorithm based on the slowness principle and has originally been developed to learn invariances in a model of the primate visual system. Although developed for computational neuroscience, SFA has turned out to be a versatile algorithm also for technical applications since it can be used for feature extraction, dimensionality reduction, and invariance learning. With minor adaptations SFA can also be applied to supervised learning problems such as classification and regression. In this work, we review several illustrative examples of possible applications including the estimation of driving forces, nonlinear blind source separation, traffic sign recognition, and face processing.
    BibTeX:
    			
                            @article{Escalante-B.Wiskott-2012a,
                              author       = {Alberto N. Escalante-B. and Laurenz Wiskott},
                              title        = {Slow {F}eature {A}nalysis: perspectives for technical applications of a versatile learning algorithm},
                              journal      = {Künstliche Intelligenz [Artificial Intelligence]},
                              year         = {2012},
                              volume       = {26},
                              number       = {4},
                              pages        = {341--348},
    			  
    			  url          = {http://www.springerlink.com/content/vk3738325250162k/},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/EscalanteWiskott-2012a-KI-SFATechnicalApplications-Preprint.pdf},
    			  
                              doi          = {http://doi.org/10.1007/s13218-012-0190-7}
                            }
    			
    					
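The slowness principle behind the applications surveyed in the abstract above can be made concrete with a minimal linear SFA sketch (the function and toy signals are mine, not from the paper): whiten the input, then rotate to the directions whose temporal derivative has the least variance.

```python
import numpy as np

def linear_sfa(x, n_out):
    """Extract the n_out slowest linear features from a time series x of shape (T, d)."""
    x = x - x.mean(axis=0)
    # 1) whiten: decorrelate the input components and normalize them to unit variance
    eigval, eigvec = np.linalg.eigh(np.cov(x, rowvar=False))
    z = x @ (eigvec / np.sqrt(eigval))
    # 2) among all unit-variance projections of z, pick those whose temporal
    #    derivative has the smallest variance (the "slowest" directions)
    dval, dvec = np.linalg.eigh(np.cov(np.diff(z, axis=0), rowvar=False))
    return z @ dvec[:, :n_out]  # eigh sorts eigenvalues ascending: slowest first

# toy demo: a slow sine and a fast sine, linearly mixed
t = np.linspace(0, 4 * np.pi, 2000)
mix = np.column_stack([np.sin(t) + 0.5 * np.sin(25 * t),
                       0.5 * np.sin(t) - np.sin(25 * t)])
y = linear_sfa(mix, 2)
delta = lambda v: np.mean(np.diff(v) ** 2)  # slowness measure: smaller = slower
```

Here the first output essentially recovers the slow sine, i.e. delta(y[:, 0]) is far smaller than delta(y[:, 1]); nonlinear SFA applies the same two steps after a fixed nonlinear expansion of the input.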
    Escalante-B., A.N. & Wiskott, L. 2012 How to solve classification and regression problems on real data with Slow Feature Analysis 21st Machine Learning Summer School, Aug 27 - Sep 7, Kyoto University, Japan Poster .
     
    inproceedings Supervised Learning with GSFA (2009-2017)
    BibTeX:
    			
                            @inproceedings{Escalante-B.Wiskott-2012b,
                              author       = {Alberto N. Escalante-B. and Laurenz Wiskott},
                              title        = {How to solve classification and regression problems on real data with {S}low {F}eature {A}nalysis},
                              booktitle    = {21st Machine Learning Summer School, Aug 27 -- Sep 7, Kyoto University, Japan},
                              year         = {2012},
                              howpublished = {Poster}
                            }
    			
    					
    Escalante-B., A.N. & Wiskott, L. 2013 How to solve classification and regression problems on high-dimensional data with a supervised extension of Slow Feature Analysis Journal of Machine Learning Research , 14 , 3683-3719 .
     
    article Supervised Learning with GSFA (2009-2017)
    Abstract: Supervised learning from high-dimensional data, e.g., multimedia data, is a challenging task. We propose an extension of slow feature analysis (SFA) for supervised dimensionality reduction called graph-based SFA (GSFA). The algorithm extracts a label-predictive low-dimensional set of features that can be post-processed by typical supervised algorithms to generate the final label or class estimation. GSFA is trained with a so-called training graph, in which the vertices are the samples and the edges represent similarities of the corresponding labels. A new weighted SFA optimization problem is introduced, generalizing the notion of slowness from sequences of samples to such training graphs. We show that GSFA computes an optimal solution to this problem in the considered function space, and propose several types of training graphs. For classification, the most straightforward graph yields features equivalent to those of (nonlinear) Fisher discriminant analysis. Emphasis is on regression, where four different graphs were evaluated experimentally with a subproblem of face detection on photographs. The method proposed is promising particularly when linear models are insufficient, as well as when feature selection is difficult.
    BibTeX:
    			
                            @article{Escalante-B.Wiskott-2013b,
                              author       = {Alberto N. Escalante-B. and Laurenz Wiskott},
                              title        = {How to solve classification and regression problems on high-dimensional data with a supervised extension of {S}low {F}eature {A}nalysis},
                              journal      = {Journal of Machine Learning Research},
                              year         = {2013},
                              volume       = {14},
                              pages        = {3683--3719},
    			  
    			  url          = {http://jmlr.org/papers/v14/escalante13a.html}
                            }
    			
    					
    Escalante-B., A.N. & Wiskott, L. 2015 Theoretical analysis of the optimal free responses of graph-based SFA for the design of training graphs CoRR e-print arXiv:1509.08329 .
     
    misc Supervised Learning with GSFA (2009-2017)
    BibTeX:
    			
                            @misc{Escalante-B.Wiskott-2015,
                              author       = {Alberto N. Escalante-B. and Laurenz Wiskott},
                              title        = {Theoretical analysis of the optimal free responses of graph-based {SFA} for the design of training graphs},
                              journal      = {CoRR},
                              year         = {2015},
                              howpublished = {e-print arXiv:1509.08329},
    			  
    			  url          = {https://arxiv.org/abs/1509.08329}
                            }
    			
    					
    Escalante-B., A.N. & Wiskott, L. 2016 Improved graph-based SFA: information preservation complements the slowness principle CoRR e-print arXiv:1601.03945 .
    (Submitted to Pattern Recognition)  
    misc Supervised Learning with GSFA (2009-2017)
    Abstract: Slow feature analysis (SFA) is an unsupervised-learning algorithm that extracts slowly varying features from a multi-dimensional time series. A supervised extension to SFA for classification and regression is graph-based SFA (GSFA). GSFA is based on the preservation of similarities, which are specified by a graph structure derived from the labels. It has been shown that hierarchical GSFA (HGSFA) allows learning from images and other high-dimensional data. The feature space spanned by HGSFA is complex due to the composition of the nonlinearities of the nodes in the network. However, we show that the network discards useful information prematurely before it reaches higher nodes, resulting in suboptimal global slowness and an under-exploited feature space. To counteract these problems, we propose an extension called hierarchical information-preserving GSFA (HiGSFA), where information preservation complements the slowness-maximization goal. We build a 10-layer HiGSFA network to estimate human age from facial photographs of the MORPH-II database, achieving a mean absolute error of 3.50 years, improving the state-of-the-art performance. HiGSFA and HGSFA support multiple labels and offer a rich feature space, feed-forward training, and linear complexity in the number of samples and dimensions. Furthermore, HiGSFA outperforms HGSFA in terms of feature slowness, estimation accuracy and input reconstruction, giving rise to a promising hierarchical supervised-learning approach.
    BibTeX:
    			
                            @misc{Escalante-B.Wiskott-2016a,
                              author       = {Alberto N. Escalante-B. and Laurenz Wiskott},
                              title        = {Improved graph-based {SFA}: information preservation complements the slowness principle},
                              journal      = {CoRR},
                              year         = {2016},
                              howpublished = {e-print arXiv:1601.03945},
    			  
    			  url          = {https://arxiv.org/abs/1601.03945}
                            }
    			
    					
    Escalante-B., A.N. & Wiskott, L. 2016 Theoretical analysis of the optimal free responses of graph-based SFA for the design of training graphs Journal of Machine Learning Research , 17(157) , 1-36 .
     
    article Supervised Learning with GSFA (2009-2017)
    Abstract: Slow feature analysis (SFA) is an unsupervised learning algorithm that extracts slowly varying features from a multi-dimensional time series. Graph-based SFA (GSFA) is an extension to SFA for supervised learning that can be used to successfully solve regression problems if combined with a simple supervised post-processing step on a small number of slow features. The objective function of GSFA minimizes the squared output differences between pairs of samples specified by the edges of a structure called training graph. The edges of current training graphs, however, are derived only from the relative order of the labels. Exploiting the exact numerical value of the labels enables further improvements in label estimation accuracy. In this article, we propose the exact label learning (ELL) method to create a more precise training graph that encodes the desired labels explicitly and allows GSFA to extract a normalized version of them directly (i.e., without supervised post-processing). The ELL method is used for three tasks: (1) We estimate gender from artificial images of human faces (regression) and show the advantage of coding additional labels, particularly skin color. (2) We analyze two existing graphs for regression. (3) We extract compact discriminative features to classify traffic sign images. When the number of output features is limited, such compact features provide a higher classification rate compared to a graph that generates features equivalent to those of nonlinear Fisher discriminant analysis. The method is versatile, directly supports multiple labels, and provides higher accuracy compared to current graphs for the problems considered.
    BibTeX:
    			
                            @article{Escalante-B.Wiskott-2016b,
                              author       = {Alberto N. Escalante-B. and Laurenz Wiskott},
                              title        = {Theoretical analysis of the optimal free responses of graph-based {SFA} for the design of training graphs},
                              journal      = {Journal of Machine Learning Research},
                              year         = {2016},
                              volume       = {17},
                              number       = {157},
                              pages        = {1--36},
    			  
    			  url          = {http://jmlr.org/papers/v17/15-311.html}
                            }
    			
    					
    Escalante-B., A.N. & Wiskott, L. 2019 Improved graph-based SFA: information preservation complements the slowness principle Machine Learning , 109 , 999-1037 .
     
    article
    Abstract: Slow feature analysis (SFA) is an unsupervised learning algorithm that extracts slowly varying features from a multi-dimensional time series. SFA has been extended to supervised learning (classification and regression) by an algorithm called graph-based SFA (GSFA). GSFA relies on a particular graph structure to extract features that preserve label similarities. Processing of high dimensional input data (e.g., images) is feasible via hierarchical GSFA (HGSFA), resulting in a multi-layer neural network. Although HGSFA has useful properties, in this work we identify a shortcoming, namely, that HGSFA networks prematurely discard quickly varying but useful features before they reach higher layers, resulting in suboptimal global slowness and an under-exploited feature space. To counteract this shortcoming, which we call unnecessary information loss, we propose an extension called hierarchical information-preserving GSFA (HiGSFA), where some features fulfill a slowness objective and other features fulfill an information preservation objective. The efficacy of the extension is verified in three experiments: (1) an unsupervised setup where the input data is the visual stimuli of a simulated rat, (2) the localization of faces in image patches, and (3) the estimation of human age from facial photographs of the MORPH-II database. Both HiGSFA and HGSFA can learn multiple labels and offer a rich feature space, feed-forward training, and linear complexity in the number of samples and dimensions. However, the proposed algorithm, HiGSFA, outperforms HGSFA in terms of feature slowness, estimation accuracy, and input reconstruction, giving rise to a promising hierarchical supervised-learning approach. Moreover, for age estimation, HiGSFA achieves a mean absolute error of 3.41 years, which is a competitive performance for this challenging problem.
    BibTeX:
    			
                            @article{Escalante-B.Wiskott-2019,
                              author       = {Escalante-B., Alberto N. and Wiskott, Laurenz},
                              title        = {Improved graph-based {SFA}: information preservation complements the slowness principle},
                              journal      = {Machine Learning},
                              year         = {2019},
                              volume       = {109},
                              pages        = {999--1037},
    			  
    			  url          = {https://doi.org/10.1007/s10994-019-05860-9},
    			  
                              doi          = {http://doi.org/10.1007/s10994-019-05860-9}
                            }
    			
    					
    Escalante-B., A.-N. & Wiskott, L. 2013 How to solve classification and regression problems on high-dimensional data with a supervised extension of Slow Feature Analysis Cognitive Sciences EPrint Archive (CogPrints) , 8966 .
     
    misc Supervised Learning with GSFA (2009-2017)
    BibTeX:
    			
                            @misc{Escalante-B.Wiskott-2013a,
                              author       = {Alberto-N. Escalante-B. and Laurenz Wiskott},
                              title        = {How to solve classification and regression problems on high-dimensional data with a supervised extension of {S}low {F}eature {A}nalysis},
                              year         = {2013},
                              volume       = {8966},
                              howpublished = {Cognitive Sciences EPrint Archive (CogPrints)},
    			  
    			  url          = {http://cogprints.org/8966/}
                            }
    			
    					
    Fang, J.; Rüther, N.; Bellebaum, C.; Wiskott, L. & Cheng, S. 2018 The Interaction between Semantic Representation and Episodic Memory Neural Computation , 30(2) , 293-332 .
     
    article
    BibTeX:
    			
                            @article{FangRuetherEtAl-2018,
                              author       = {Fang, Jing and R{\"u}ther, Naima and Bellebaum, Christian and Wiskott, Laurenz and Cheng, Sen},
                              title        = {The Interaction between Semantic Representation and Episodic Memory},
                              journal      = {Neural Computation},
                              year         = {2018},
                              volume       = {30},
                              number       = {2},
                              pages        = {293-332},
    			  
                              doi          = {http://doi.org/10.1162/neco_a_01044}
                            }
    			
    					
    Fayyaz, Z.; Altamimi, A.; Cheng, S. & Wiskott, L. 2021 A model of semantic completion in generative episodic memory .
     
    misc
    BibTeX:
    			
                            @misc{FayyazAltamimiEtAl-2021,
                              author       = {Zahra Fayyaz and Aya Altamimi and Sen Cheng and Laurenz Wiskott},
                              title        = {A model of semantic completion in generative episodic memory},
                              year         = {2021},
    			  
                              doi          = {http://doi.org/10.48550/arXiv.2111.13537}
                            }
    			
    					
    Fayyaz, Z.; Altamimi, A.; Zoellner, C.; Klein, N.; Wolf, O.T.; Cheng, S. & Wiskott, L. 2022 A Model of Semantic Completion in Generative Episodic Memory Neural Computation , 34(9) , 1841–1870 .
     
    article
    BibTeX:
    			
                            @article{FayyazAltamimiEtAl-2022,
                              author       = {Fayyaz, Zahra and Altamimi, Aya and Zoellner, Carina and Klein, Nicole and Wolf, Oliver T. and Cheng, Sen and Wiskott, Laurenz},
                              title        = {A Model of Semantic Completion in Generative Episodic Memory},
                              journal      = {Neural Computation},
                              year         = {2022},
                              volume       = {34},
                              number       = {9},
                              pages        = {1841--1870},
    			  
                              doi          = {http://doi.org/10.1162/neco_a_01520}
                            }
    			
    					
    Fellous, J.-M.; Wiskott, L.; Krüger, N. & von der Malsburg, C. 1997 Face recognition by Elastic Bunch Graph Matching Proc. Intl. Conf. on Vision, Recognition, Action: Neural Models of Mind and Machine .
     
    inproceedings Face recognition with EBGM (1993,1994)
    BibTeX:
    			
                            @inproceedings{FellousWiskottEtAl-1997,
                              author       = {Jean-Marc Fellous and Laurenz Wiskott and Norbert Krüger and Christoph von der Malsburg},
                              title        = {Face recognition by {E}lastic {B}unch {G}raph {M}atching},
                              booktitle    = {Proc.\ Intl.\ Conf.\ on Vision, Recognition, Action: Neural Models of Mind and Machine},
                              year         = {1997}
                            }
    			
    					
    Franzius, M. 2008 Slowness and sparseness for unsupervised learning of spatial and object codes from naturalistic data Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät I .
     
    phdthesis SFA: Place cells I (2003-2007), SFA: Learning visual invariances II (2006-2009)
    BibTeX:
    			
                            @phdthesis{Franzius-2008,
                              author       = {Mathias Franzius},
                              title        = {Slowness and sparseness for unsupervised learning of spatial and object codes from naturalistic data},
                              school       = {Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät I},
                              year         = {2008},
    			  
    			  url          = {http://edoc.hu-berlin.de/docviews/abstract.php?id=29124},
    			  
                              doi          = {http://doi.org/10.18452/15784}
                            }
    			
    					
    Franzius, M.; Sprekeler, H. & Wiskott, L. 2007 Slowness and sparseness lead to place-, head direction-, and spatial-view cells Proc. 3rd Annual Computational Cognitive Neuroscience Conference, Nov 1-2, San Diego, USA , III-8 .
     
    inproceedings SFA: Place cells I (2003-2007)
    BibTeX:
    			
                            @inproceedings{FranziusSprekelerEtAl-2007e,
                              author       = {Mathias Franzius and Henning Sprekeler and Laurenz Wiskott},
                              title        = {Slowness and sparseness lead to place-, head direction-, and spatial-view cells},
                              booktitle    = {Proc.\ 3rd Annual Computational Cognitive Neuroscience Conference, Nov 1--2, San Diego, USA},
                              year         = {2007},
                              pages        = {III-8}
                            }
    			
    					
    Franzius, M.; Sprekeler, H. & Wiskott, L. 2007 Unsupervised learning of visually driven place cells in the hippocampus Kognitionsforschung 2007, Beiträge zur 8. Jahrestagung der Gesellschaft für Kognitionswissenschaft (KogWis'07), Mar 19-21, Saarbrücken, Germany , 60 .
     
    inproceedings SFA: Place cells I (2003-2007)
    BibTeX:
    			
                            @inproceedings{FranziusSprekelerEtAl-2007a,
                              author       = {M. Franzius and H. Sprekeler and L. Wiskott},
                              title        = {Unsupervised learning of visually driven place cells in the hippocampus},
                              booktitle    = {Kognitionsforschung 2007, Beiträge zur 8. Jahrestagung der Gesellschaft für Kognitionswissenschaft (KogWis'07), Mar 19-21, Saarbrücken, Germany},
                              publisher    = {Shaker Verlag},
                              year         = {2007},
                              pages        = {60}
                            }
    			
    					
    Franzius, M.; Sprekeler, H. & Wiskott, L. 2006 Slowness leads to place cells Proc. Berlin Neuroscience Forum, Jun 8-10, Bad Liebenwalde, Germany , 42 .
     
    inproceedings SFA: Place cells I (2003-2007)
    BibTeX:
    			
                            @inproceedings{FranziusSprekelerEtAl-2006a,
                              author       = {M. Franzius and H. Sprekeler and L. Wiskott},
                              title        = {Slowness leads to place cells},
                              booktitle    = {Proc.\ Berlin Neuroscience Forum, Jun 8--10, Bad Liebenwalde, Germany},
                              publisher    = {Max-Delbrück-Centrum für Molekulare Medizin (MDC)},
                              year         = {2006},
                              pages        = {42}
                            }
    			
    					
    Franzius, M.; Sprekeler, H. & Wiskott, L. 2006 Slowness leads to place cells Proc. 2nd Bernstein Symposium for Computational Neuroscience, Oct 1-3, Berlin, Germany , 45 .
     
    inproceedings SFA: Place cells I (2003-2007)
    BibTeX:
    			
                            @inproceedings{FranziusSprekelerEtAl-2006b,
                              author       = {M. Franzius and H. Sprekeler and L. Wiskott},
                              title        = {Slowness leads to place cells},
                              booktitle    = {Proc.\ 2nd Bernstein Symposium for Computational Neuroscience, Oct 1--3, Berlin, Germany},
                              publisher    = {Bernstein Center for Computational Neuroscience (BCCN) Berlin},
                              year         = {2006},
                              pages        = {45}
                            }
    			
    					
    Franzius, M.; Sprekeler, H. & Wiskott, L. 2006 Slowness leads to place cells Proc. 15th Annual Computational Neuroscience Meeting (CNS'06), Jul 16-20, Edinburgh, Scotland .
    (Poster T5)  
    inproceedings SFA: Place cells I (2003-2007)
    BibTeX:
    			
                            @inproceedings{FranziusSprekelerEtAl-2006c,
                              author       = {M. Franzius and H. Sprekeler and L. Wiskott},
                              title        = {Slowness leads to place cells},
                              booktitle    = {Proc.\ 15th Annual Computational Neuroscience Meeting (CNS'06), Jul 16--20, Edinburgh, Scotland},
                              year         = {2006}
                            }
    			
    					
    Franzius, M.; Sprekeler, H. & Wiskott, L. 2007 Unsupervised learning of place cells and head direction cells with slow feature analysis Proc. 7th Göttingen Meeting of the German Neuroscience Society, Mar 29 - Apr 1, Göttingen, Germany , TS19-1C .
     
    inproceedings SFA: Place cells I (2003-2007)
    BibTeX:
    			
                            @inproceedings{FranziusSprekelerEtAl-2007b,
                              author       = {M. Franzius and H. Sprekeler and L. Wiskott},
                              title        = {Unsupervised learning of place cells and head direction cells with slow feature analysis},
                              booktitle    = {Proc.\ 7th Göttingen Meeting of the German Neuroscience Society, Mar 29 -- Apr 1, Göttingen, Germany},
                              year         = {2007},
                              pages        = {TS19--1C}
                            }
    			
    					
    Franzius, M.; Sprekeler, H. & Wiskott, L. 2007 Learning of place cells, head-direction cells, and spatial-view cells with Slow Feature Analysis on quasi-natural videos Cognitive Sciences EPrint Archive (CogPrints) , 5492 .
     
    misc SFA: Place cells I (2003-2007)
    BibTeX:
    			
                            @misc{FranziusSprekelerEtAl-2007c,
                              author       = {Mathias Franzius and Henning Sprekeler and Laurenz Wiskott},
                              title        = {Learning of place cells, head-direction cells, and spatial-view cells with {S}low {F}eature {A}nalysis on quasi-natural videos},
                              year         = {2007},
                              volume       = {5492},
                              howpublished = {Cognitive Sciences EPrint Archive (CogPrints)},
    			  
    			  url          = {http://cogprints.org/5711/}
                            }
    			
    					
    Franzius, M.; Sprekeler, H. & Wiskott, L. 2007 Slowness and sparseness lead to place, head-direction, and spatial-view cells PLoS Computational Biology , 3(8) , e166 .
     
    article SFA: Place cells I (2003-2007)
    Abstract: We present a model for the self-organized formation of place cells, head-direction cells, and spatial-view cells in the hippocampal formation based on unsupervised learning on quasi-natural visual stimuli. The model comprises a hierarchy of Slow Feature Analysis (SFA) nodes, which were recently shown to reproduce many properties of complex cells in the early visual system [1]. The system extracts a distributed grid-like representation of position and orientation, which is transcoded into a localized place-field, head-direction, or view representation, by sparse coding. The type of cells that develops depends solely on the relevant input statistics, i.e., the movement pattern of the simulated animal. The numerical simulations are complemented by a mathematical analysis that allows us to accurately predict the output of the top SFA layer.
    BibTeX:
    			
                            @article{FranziusSprekelerEtAl-2007d,
                              author       = {Mathias Franzius and Henning Sprekeler and Laurenz Wiskott},
                              title        = {Slowness and sparseness lead to place, head-direction, and spatial-view cells},
                              journal      = {PLoS Computational Biology},
                              year         = {2007},
                              volume       = {3},
                              number       = {8},
                              pages        = {e166},
    			  
    			  url          = {http://www.ploscompbiol.org/article/info:doi/10.1371/journal.pcbi.0030166},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/FranziusSprekelerEtAl-2007d-PLoSCompBiol-PlaceCells.pdf},
    			  
                              doi          = {http://doi.org/10.1371/journal.pcbi.0030166}
                            }
    			
    					
    Franzius, M.; Vollgraf, R. & Wiskott, L. 2006 From grids to places Cognitive Sciences EPrint Archive (CogPrints) , 5101 .
     
    misc From Grids to Places (2005, 2006)
    BibTeX:
    			
                            @misc{FranziusVollgrafEtAl-2006,
                              author       = {Mathias Franzius and Roland Vollgraf and Laurenz Wiskott},
                              title        = {From grids to places},
                              year         = {2006},
                              volume       = {5101},
                              howpublished = {Cognitive Sciences EPrint Archive (CogPrints)},
    			  
    			  url          = {http://cogprints.org/5101/}
                            }
    			
    					
    Franzius, M.; Vollgraf, R. & Wiskott, L. 2007 From grids to places Journal of Computational Neuroscience , 22(3) , 297-299 .
     
    article From Grids to Places (2005, 2006)
    Abstract: Hafting et al. (2005) described grid cells in the dorsocaudal region of the medial entorhinal cortex (dMEC). These cells show a strikingly regular grid-like firing-pattern as a function of the position of a rat in an enclosure. Since the dMEC projects to the hippocampal areas containing the well-known place cells, the question arises whether and how the localized responses of the latter can emerge based on the output of grid cells. Here, we show that, starting with simulated grid-cells, a simple linear transformation maximizing sparseness leads to a localized representation similar to place fields.
    BibTeX:
    			
                            @article{FranziusVollgrafEtAl-2007,
                              author       = {Mathias Franzius and Roland Vollgraf and Laurenz Wiskott},
                              title        = {From grids to places},
                              journal      = {Journal of Computational Neuroscience},
                              year         = {2007},
                              volume       = {22},
                              number       = {3},
                              pages        = {297--299},
    			  
    			  url          = {http://www.springerlink.com/content/r6lj66670057871q/},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/FranziusVollgrafEtAl-2007-JCompNeurosci-GridsToPlaces-Preprint.pdf},
    			  
                              doi          = {http://doi.org/10.1007/s10827-006-0013-7}
                            }
    			
    					
    Franzius, M.; Wilbert, N. & Wiskott, L. 2008 Invariant object recognition with Slow Feature Analysis Proc. 18th Intl. Conf. on Artificial Neural Networks (ICANN'08), Prague , Lecture Notes in Computer Science , 5163 , 961-970 .
     
    inproceedings SFA: Learning visual invariances II (2006-2009)
    BibTeX:
    			
                            @inproceedings{FranziusWilbertEtAl-2008b,
                              author       = {Mathias Franzius and Niko Wilbert and Laurenz Wiskott},
                              title        = {Invariant object recognition with {S}low {F}eature {A}nalysis},
                              booktitle    = {Proc.\ 18th Intl.\ Conf.\ on Artificial Neural Networks (ICANN'08), Prague},
                              publisher    = {Springer},
                              year         = {2008},
                              volume       = {5163},
                              pages        = {961--970},
    			  
    			  url          = {http://dx.doi.org/10.1007/978-3-540-87536-9_98},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/FranziusWilbertEtAl-2008b-ProcICANN-SFAInvariances2D.pdf},
    			  
                              doi          = {http://doi.org/10.1007/978-3-540-87536-9_98}
                            }
    			
    					
    Franzius, M.; Wilbert, N. & Wiskott, L. 2007 Unsupervised learning of invariant 3D-object representations with Slow Feature Analysis Proc. 3rd Bernstein Symposium for Computational Neuroscience, Sep 24-27, Göttingen, Germany , 105 .
     
    inproceedings SFA: Learning visual invariances II (2006-2009)
    BibTeX:
    			
                            @inproceedings{FranziusWilbertEtAl-2007,
                              author       = {Mathias Franzius and Niko Wilbert and Laurenz Wiskott},
                              title        = {Unsupervised learning of invariant 3{D}-object representations with {S}low {F}eature {A}nalysis},
                              booktitle    = {Proc.\ 3rd Bernstein Symposium for Computational Neuroscience, Sep 24--27, Göttingen, Germany},
                              publisher    = {Bernstein Center for Computational Neuroscience (BCCN) Göttingen},
                              year         = {2007},
                              pages        = {105}
                            }
    			
    					
    Franzius, M.; Wilbert, N. & Wiskott, L. 2008 Unsupervised learning of invariant 3D-object and pose representations with Slow Feature Analysis Proc. Federation of European Neuroscience Societies Forum (FENS'08), Jul 12-16, Geneva, Switzerland .
     
    inproceedings SFA: Learning visual invariances II (2006-2009)
    BibTeX:
    			
                            @inproceedings{FranziusWilbertEtAl-2008a,
                              author       = {Mathias Franzius and Niko Wilbert and Laurenz Wiskott},
                              title        = {Unsupervised learning of invariant 3{D}-object and pose representations with {S}low {F}eature {A}nalysis},
                              booktitle    = {Proc.\ Federation of European Neuroscience Societies Forum (FENS'08), Jul 12--16, Geneva, Switzerland},
                              year         = {2008}
                            }
    			
    					
    Franzius, M.; Wilbert, N. & Wiskott, L. 2011 Invariant object recognition and pose estimation with Slow Feature Analysis Neural Computation , 23(9) , 2289-2323 .
     
    article SFA: Learning visual invariances II (2006-2009)
    Abstract: Primates are very good at recognizing objects independent of viewing angle or retinal position, and they outperform existing computer vision systems by far. But invariant object recognition is only one prerequisite for successful interaction with the environment. An animal also needs to assess an object's position and relative rotational angle. We propose here a model that is able to extract object identity, position, and rotation angles. We demonstrate the model behavior on complex three-dimensional objects under translation and rotation in depth on a homogeneous background. A similar model has previously been shown to extract hippocampal spatial codes from quasi-natural videos. The framework for mathematical analysis of this earlier application carries over to the scenario of invariant object recognition. Thus, the simulation results can be explained analytically even for the complex high-dimensional data we employed.
    BibTeX:
    			
                            @article{FranziusWilbertEtAl-2011,
                              author       = {Franzius, Mathias and Wilbert, Niko and Wiskott, Laurenz},
                              title        = {Invariant object recognition and pose estimation with {S}low {F}eature {A}nalysis},
                              journal      = {Neural Computation},
                              year         = {2011},
                              volume       = {23},
                              number       = {9},
                              pages        = {2289--2323},
    			  
    			  url          = {http://www.mitpressjournals.org/doi/pdf/10.1162/NECO_a_00171},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/FranziusWilbertEtAl-2011-NeurComp.pdf},
    			  
                              doi          = {http://doi.org/10.1162/NECO_a_00171}
                            }
    			
    					
    Freiwald, J.; Karbasi, M.; Zeiler, S.; Melchior, J.; Kompella, V.; Wiskott, L. & Kolossa, D. 2018 Utilizing Slow Feature Analysis for Lipreading Speech Communication; 13th ITG-Symposium , 191-195 .
     
    inproceedings
    Abstract: While speech recognition has become highly robust in the recent past, it is still a challenging task under very noisy or reverberant conditions. Augmenting speech recognition by lipreading from video input is hence a promising approach to make speech recognition more reliable. For this purpose, we consider slow feature analysis (SFA), an unsupervised machine learning method that finds the temporally slowest varying features in sequential input data. It can automatically extract temporally slow features within a video sequence, such as lip movements, while at the same time removing quickly changing components such as noise. In this work, we apply SFA as an initial feature extraction step to the task of automatic lipreading. The performance is evaluated on small-vocabulary lipreading, both in the speaker-dependent and speaker-independent case, showing that the features are competitive with the often highly successful combination of a discrete cosine transform and a linear discriminant analysis, while also offering good interpretability.
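As a rough illustration of the slowness principle this abstract relies on, linear SFA can be written in a few lines: whiten the input, then keep the projections whose temporal derivative has minimal variance. The following is a minimal sketch, not the authors' implementation; the function name `linear_sfa` and the toy sine mixture are hypothetical.

```python
import numpy as np

def linear_sfa(x, n_components=1):
    """Minimal linear Slow Feature Analysis on a (T, D) signal:
    returns the projections whose outputs vary most slowly over time."""
    x = x - x.mean(axis=0)                   # center the data
    # Whiten: rotate and rescale so the input components are
    # uncorrelated with unit variance.
    eigval, eigvec = np.linalg.eigh(np.cov(x, rowvar=False))
    z = x @ (eigvec / np.sqrt(eigval))
    # Slowness objective: minimize the variance of the temporal
    # derivative; the smallest eigenvectors give the slowest features.
    dval, dvec = np.linalg.eigh(np.cov(np.diff(z, axis=0), rowvar=False))
    return z @ dvec[:, :n_components]

# Toy demo: recover a slow sine from a mixture with a fast one.
t = np.linspace(0, 4 * np.pi, 2000)
slow, fast = np.sin(t), np.sin(25 * t)
mixture = np.stack([slow + 0.5 * fast, 0.5 * slow - fast], axis=1)
recovered = linear_sfa(mixture)              # approximately ±slow, rescaled
```

On this toy mixture the first slow feature recovers the slow sine up to sign and scale, which is the sense in which SFA keeps slowly varying structure such as lip movements while suppressing quickly changing components such as noise.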
    BibTeX:
    			
                            @inproceedings{FreiwaldKarbasiEtAl-2018,
                              author       = {J. Freiwald and M. Karbasi and S. Zeiler and J. Melchior and V. Kompella and L. Wiskott and D. Kolossa},
                              title        = {Utilizing {S}low {F}eature {A}nalysis for Lipreading},
                              booktitle    = {Speech Communication; 13th ITG-Symposium},
                              publisher    = {{VDE} {Verlag} {GmbH}},
                              year         = {2018},
                              pages        = {191--195},
                              url          = {https://ieeexplore.ieee.org/document/8578021}
                            }
    			
    					
    Görler, R.; Wiskott, L. & Cheng, S. 2020 Improving sensory representations using episodic memory Hippocampus , 30(6) , 638-656 .
     
    article
    BibTeX:
    			
                            @article{GoerlerWiskottEtAl-2020,
                              author       = {G{\"o}rler, Richard and Wiskott, Laurenz and Cheng, Sen},
                              title        = {Improving sensory representations using episodic memory},
                              journal      = {Hippocampus},
                              publisher    = {Wiley Online Library},
                              year         = {2020},
                              volume       = {30},
                              number       = {6},
                              pages        = {638--656},
                              doi          = {http://doi.org/10.1002/hipo.23186}
                            }
    			
    					
    Ha Quang, M.; Ha Kang, S. & Le, T.M. 2009 Reproducing kernels and colorization Proc. 8th International Conference on Sampling Theory and Applications (SAMPTA), May 18-22, Marseille, France .
     
    inproceedings N.N.
    BibTeX:
    			
                            @inproceedings{HaQuangHaKangEtAl-2009,
                              author       = {Ha Quang, Minh and Ha Kang, Sung and Triet Minh Le},
                              title        = {Reproducing kernels and colorization},
                              booktitle    = {Proc.\ 8th International Conference on Sampling Theory and Applications (SAMPTA), May 18--22, Marseille, France},
                              year         = {2009}
                            }
    			
    					
    Ha Quang, M.; Ha Kang, S. & Le, T.M. 2010 Image and video colorization using vector-valued reproducing kernel Hilbert spaces Journal of Mathematical Imaging and Vision , 37(1) , 49-65 .
     
    article N.N.
    BibTeX:
    			
                            @article{HaQuangHaKangEtAl-2010,
                              author       = {Ha Quang, Minh and Ha Kang, Sung and Triet Minh Le},
                              title        = {Image and video colorization using vector-valued reproducing kernel {H}ilbert spaces},
                              journal      = {Journal of Mathematical Imaging and Vision},
                              year         = {2010},
                              volume       = {37},
                              number       = {1},
                              pages        = {49--65},
                              url          = {http://itb.biologie.hu-berlin.de/~minh/color_01_15_2010_final.pdf},
                              doi          = {http://doi.org/10.1007/s10851-010-0192-8}
                            }
    			
    					
    Ha Quang, M.; Pillonetto, G. & Chiuso, A. 2009 Nonlinear system identification via Gaussian regression and mixtures of kernels Proc. 15th IFAC Symposium on System Identification (SYSID), Jul 6-8, Saint-Malo, France .
     
    inproceedings N.N.
    BibTeX:
    			
                            @inproceedings{HaQuangPillonettoEtAl-2009,
                              author       = {Ha Quang, Minh and Gianluigi Pillonetto and Alessandro Chiuso},
                              title        = {Nonlinear system identification via {G}aussian regression and mixtures of kernels},
                              booktitle    = {Proc.\ 15th IFAC Symposium on System Identification (SYSID), Jul 6--8, Saint-Malo, France},
                              year         = {2009},
                              url          = {http://www.sciencedirect.com/science/article/pii/S1474667016387018},
                              doi          = {http://doi.org/10.3182/20090706-3-fr-2004.00087}
                            }
    			
    					
    Ha Quang, M. & Wiskott, L. 2011 Slow Feature Analysis and decorrelation filtering for separating correlated sources Proc. 13th International Conference on Computer Vision (ICCV), Nov 6-13, Barcelona, Spain , 866-873 .
     
    inproceedings Separating correlated sources (2009-2012)
    BibTeX:
    			
                            @inproceedings{HaQuangWiskott-2011,
                              author       = {Ha Quang, Minh and Wiskott, L.},
                              title        = {Slow {F}eature {A}nalysis and decorrelation filtering for separating correlated sources},
                              booktitle    = {Proc.\ 13th International Conference on Computer Vision (ICCV), Nov 6--13, Barcelona, Spain},
                              year         = {2011},
                              pages        = {866--873},
                              url          = {http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=6126327&abstractAccess=no&userType=inst},
                              doi          = {http://doi.org/10.1109/ICCV.2011.6126327}
                            }
    			
    					
    Ha Quang, M. & Wiskott, L. 2013 Multivariate Slow Feature Analysis and decorrelation filtering for blind source separation IEEE Trans. on Image Processing , 22(7) , 2737-2750 .
     
    article Separating correlated sources (2009-2012)
    Abstract: We generalize the method of Slow Feature Analysis (SFA) for vector-valued functions of several variables and apply it to the problem of blind source separation, in particular to image separation. It is generally necessary to use multivariate SFA instead of univariate SFA for separating multi-dimensional signals. For the linear case, an exact mathematical analysis is given, which shows in particular that the sources are perfectly separated by SFA if and only if they and their first order derivatives are uncorrelated. When the sources are correlated, we apply the following technique called decorrelation filtering: use a linear filter to decorrelate the sources and their derivatives in the given mixture, then apply the unmixing matrix obtained on the filtered mixtures to the original mixtures. If the filtered sources are perfectly separated by this matrix, so are the original sources. A decorrelation filter can be numerically obtained by solving a nonlinear optimization problem. This technique can also be applied to other linear separation methods, whose output signals are decorrelated, such as ICA. When there are more mixtures than sources, one can determine the actual number of sources by using a regularized version of SFA with decorrelation filtering. Extensive numerical experiments using SFA and ICA with decorrelation filtering, supported by mathematical analysis, demonstrate the potential of our methods for solving problems involving blind source separation.
    BibTeX:
    			
                            @article{HaQuangWiskott-2013,
                              author       = {Ha Quang, Minh and Wiskott, L.},
                              title        = {Multivariate {S}low {F}eature {A}nalysis and decorrelation filtering for blind source separation},
                              journal      = {IEEE Trans.\ on Image Processing},
                              year         = {2013},
                              volume       = {22},
                              number       = {7},
                              pages        = {2737--2750},
                              url          = {http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6497610},
                              doi          = {http://doi.org/10.1109/TIP.2013.2257808}
                            }
    			
    					
    Hinze, C. 2012 Optimale Stimuli in einem hierarchischen SFA-Netzwerk Klinik für Neurologie der Charité Berlin, Institut für theoretische Biologie der Humboldt Universität Berlin .
     
    phdthesis Analysis of higher-level receptive fields (2007-2010)
    BibTeX:
    			
                            @phdthesis{Hinze-2012,
                              author       = {Christian Hinze},
                              title        = {Optimale {S}timuli in einem hierarchischen {SFA}-{N}etzwerk},
                              school       = {Klinik für Neurologie der Charité Berlin, Institut für theoretische Biologie der Humboldt Universität Berlin},
                              year         = {2012},
                              url          = {http://www.diss.fu-berlin.de/diss/servlets/MCRFileNodeServlet/FUDISS_derivate_000000013478/},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/Hinze-2012-Doktorarbeit.pdf}
                            }
    			
    					
    Hinze, C.; Wilbert, N. & Wiskott, L. 2009 Visualization of higher-level receptive fields in a hierarchical model of the visual system Proc. 18th Annual Computational Neuroscience Meeting (CNS'09), Jul 18-23, Berlin, Germany .
    (Special issue of BMC Neuroscience 10(Suppl 1):P158)  
    inproceedings Analysis of higher-level receptive fields (2007-2010)
    Abstract: Early visual receptive fields have been measured extensively and are fairly well mapped. Receptive fields in higher areas, on the other hand, are very difficult to characterize, because it is not clear what they are tuned to and which stimuli to use to study them. Early visual receptive fields have been reproduced by computational models. Slow feature analysis (SFA), for instance, is an algorithm that finds functions that extract most slowly varying features from a multi-dimensional input sequence [1]. Applied to quasi-natural image sequences, i.e. image sequences derived from natural images by translation, rotation and zoom, SFA yields many properties of complex cells in V1 [2]. A hierarchical network of SFA units learns invariant object representations much like in IT [3]. These successes suggest that units of intermediate layers in the network might share properties with cells in V2 or V4. The goal of this project is therefore to develop techniques to visualize and characterize such units to understand how cells in V2/V4 might work. This is nontrivial because the units are highly nonlinear. The algorithm is gradient-based and applied in a cascade within the network. We start with a natural image patch as an input, which then gets optimized by gradient ascent to maximize the output of one particular unit. Figure 1 shows such optimal stimuli for units in the first (a, b) and the second layer (c, d). The latter can be associated with cells in V2/V4. We plan to extend this to higher layers and larger receptive fields and will also develop techniques to visualize the invariances of the units, i.e. those variations to the input that have little effect on the unit's output. The long-term goal is to provide a good stimulus set for characterizing cells in V2/V4. [Figure] Figure 1. Optimal stimuli of units in the first layer (a, b) and the second layer (c, d) of a hierarchical SFA network optimized for slowness and trained with quasi-natural image sequences. 
References 1. Wiskott L, Sejnowski TJ: Slow feature analysis: Unsupervised learning of invariances. Neural Computation 2002, 14:715-770. 2. Berkes P, Wiskott L: Slow feature analysis yields a rich repertoire of complex cell properties. J Vision 2005, 5:579-602. 3. Franzius M, Wilbert N, Wiskott L: Invariant object recognition with slow feature analysis. Proc 18th Int'l Conf on Artificial Neural Networks 2008, 961-970.
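The gradient-based optimal-stimulus search described in this abstract can be illustrated on a toy unit. The sketch below is hypothetical: the quadratic unit `f(x) = x^T W x` and the function `optimal_stimulus` are stand-ins for the trained, highly nonlinear SFA units, chosen because the constrained optimum can be checked analytically.

```python
import numpy as np

def optimal_stimulus(unit_grad, x0, step=0.1, n_iter=200):
    """Gradient ascent on the *input* to maximize a unit's output.
    The stimulus is kept on the unit sphere, since the output of a
    quadratic unit would otherwise grow without bound."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(n_iter):
        x = x + step * unit_grad(x)          # follow the output gradient
        x = x / np.linalg.norm(x)            # project back onto the sphere
    return x

# Toy "unit": f(x) = x^T W x with analytic gradient 2 W x.
rng = np.random.default_rng(0)
A = rng.normal(size=(16, 16))
W = A + A.T                                  # symmetric quadratic form
x_opt = optimal_stimulus(lambda x: 2 * W @ x, rng.normal(size=16))
```

For this quadratic unit the constrained optimum is the leading eigenvector of W, so the ascent can be verified; in the paper's setting the same kind of ascent is applied in a cascade through the hierarchical SFA network, starting from a natural image patch.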
    BibTeX:
    			
                            @inproceedings{HinzeWilbertEtAl-2009,
                              author       = {Christian Hinze and Niko Wilbert and Laurenz Wiskott},
                              title        = {Visualization of higher-level receptive fields in a hierarchical model of the visual system},
                              booktitle    = {Proc.\ 18th Annual Computational Neuroscience Meeting (CNS'09), Jul 18--23, Berlin, Germany},
                              year         = {2009},
                              url          = {http://www.biomedcentral.com/1471-2202/10/S1/P158},
                              url3         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/HinzeWilbertEtAl-2009-ProcCNSBerlin-Poster-HigherRFs.pdf},
                              doi          = {http://doi.org/10.1186/1471-2202-10-S1-P158}
                            }
    			
    					
    Hlynsson, H.D. 2021 Visual processing in context of reinforcement learning Ruhr-University Bochum, Institute of Neural Computation .
     
    phdthesis
    BibTeX:
    			
                            @phdthesis{Hlynsson-2021,
                              author       = {Hlynsson, Hlynur Davíð},
                              title        = {Visual processing in context of reinforcement learning},
                              school       = {Ruhr-University Bochum, Institute of Neural Computation},
                              year         = {2021},
                              url          = {https://arxiv.org/abs/2208.12525}
                            }
    			
    					
    Hlynsson, H.D.; Escalante-B., A.N. & Wiskott, L. 2019 Measuring the Data Efficiency of Deep Learning Methods CoRR e-print arXiv:1907.02549 .
     
    misc
    BibTeX:
    			
                            @misc{HlynssonEscalante-B.EtAl-2019a,
                              author       = {Hlynur Davíð Hlynsson and Alberto N. Escalante-B. and Laurenz Wiskott},
                              title        = {Measuring the Data Efficiency of Deep Learning Methods},
                              journal      = {CoRR},
                              year         = {2019},
                              howpublished = {e-print arXiv:1907.02549},
                              url          = {https://arxiv.org/abs/1907.02549}
                            }
    			
    					
    Hlynsson, H.D.; Escalante-B., A.N. & Wiskott, L. 2019 Measuring the Data Efficiency of Deep Learning Methods Proceedings of the 8th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM , 691-698 .
     
    inproceedings
    BibTeX:
    			
                            @inproceedings{HlynssonEscalante-B.EtAl-2019b,
                              author       = {Hlynur Davíð Hlynsson and Alberto N. Escalante-B. and Laurenz Wiskott},
                              title        = {Measuring the Data Efficiency of Deep Learning Methods},
                              booktitle    = {Proceedings of the 8th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM},
                              publisher    = {SciTePress},
                              year         = {2019},
                              pages        = {691--698},
                              doi          = {http://doi.org/10.5220/0007456306910698}
                            }
    			
    					
    Hlynsson, H.D.; Schüler, M.; Schiewer, R.; Glasmachers, T. & Wiskott, L. 2020 Latent Representation Prediction Networks arXiv preprint arXiv:2009.09439 .
     
    misc
    BibTeX:
    			
                            @misc{HlynssonSchuelerEtAl-2020,
                              author       = {Hlynsson, Hlynur Davíð and Schüler, Merlin and Schiewer, Robin and Glasmachers, Tobias and Wiskott, Laurenz},
                              title        = {Latent Representation Prediction Networks},
                              howpublished = {e-print arXiv:2009.09439},
                              year         = {2020},
                              url          = {https://arxiv.org/abs/2009.09439}
                            }
    			
    					
    Hlynsson, H.D.; Schüler, M.; Schiewer, R.; Glasmachers, T. & Wiskott, L. 2022 Latent Representation Prediction Networks International Journal of Pattern Recognition and Artificial Intelligence , 36(01) , 2251002 .
     
    article
    Abstract: Modern model-based reinforcement learning methods for high-dimensional inputs often incorporate an unsupervised learning step for dimensionality reduction. The training objective of these unsupervised learning methods often leverages only static inputs such as reconstructing observations. These representations are combined with predictor functions for simulating rollouts to navigate the environment. We advance this idea by taking advantage of the fact that we navigate dynamic environments with visual stimulus and create a representation that is specifically designed with control and actions in mind. We propose to learn a feature map that is maximally predictable for a predictor function. This results in representations that are well suited for the task of planning, where the predictor is used as a forward model. To this end, we introduce a new way of learning this representation along with the prediction function, a system we dub Latent Representation Prediction Network (LARP). The prediction function is used as a forward model for a search on a graph in a viewpoint-matching task, and the representation learned to maximize predictability is found to outperform other representations. The sample efficiency and overall performance of our approach are shown to rival standard reinforcement learning methods, and our learned representation transfers successfully to unseen environments.
    BibTeX:
    			
                            @article{HlynssonSchuelerEtAl-2022,
                              author       = {Hlynsson, Hlynur David and Sch\"{u}ler, Merlin and Schiewer, Robin and Glasmachers, Tobias and Wiskott, Laurenz},
                              title        = {Latent Representation Prediction Networks},
                              journal      = {International Journal of Pattern Recognition and Artificial Intelligence},
                              year         = {2022},
                              volume       = {36},
                              number       = {01},
                              pages        = {2251002},
                              url          = {https://doi.org/10.1142/S0218001422510028},
                              doi          = {http://doi.org/10.1142/S0218001422510028}
                            }
    			
    					
    Hlynsson, H.D. & Wiskott, L. 2019 Learning Gradient-Based ICA by Neurally Estimating Mutual Information KI 2019: Advances in Artificial Intelligence , 182–187 .
     
    inproceedings
    BibTeX:
    			
                            @inproceedings{HlynssonWiskott-2019b,
                              author       = {Hlynsson, Hlynur Davíð and Wiskott, Laurenz},
                              title        = {Learning Gradient-Based ICA by Neurally Estimating Mutual Information},
                              booktitle    = {KI 2019: Advances in Artificial Intelligence},
                              publisher    = {Springer International Publishing},
                              year         = {2019},
                              pages        = {182--187},
                              url          = {https://link.springer.com/chapter/10.1007/978-3-030-30179-8_15}
                            }
    			
    					
    Hlynsson, H.D. & Wiskott, L. 2019 Learning gradient-based ICA by neurally estimating mutual information CoRR e-print arXiv:1904.09858 .
     
    misc
    BibTeX:
    			
                            @misc{HlynssonWiskott-2019a,
                              author       = {Hlynsson, Hlynur Davíð and Wiskott, Laurenz},
                              title        = {Learning gradient-based ICA by neurally estimating mutual information},
                              journal      = {CoRR},
                              year         = {2019},
                              howpublished = {e-print arXiv:1904.09858},
                              url          = {https://arxiv.org/abs/1904.09858}
                            }
    			
    					
    Hlynsson, H.D. & Wiskott, L. 2021 Reward prediction for representation learning and reward shaping .
     
    misc
    BibTeX:
    			
                            @misc{HlynssonWiskott-2021a,
                              author       = {Hlynsson, Hlynur Davíð and Wiskott, Laurenz},
                              title        = {Reward prediction for representation learning and reward shaping},
                              publisher    = {arXiv},
                              year         = {2021},
                              url          = {https://arxiv.org/abs/2105.03172},
                              doi          = {http://doi.org/10.48550/ARXIV.2105.03172}
                            }
    			
    					
    Hlynsson, H.D. & Wiskott, L. 2021 Reward Prediction for Representation Learning and Reward Shaping Proceedings of the 13th International Joint Conference on Computational Intelligence - Volume 1: NCTA , 267-276 .
     
    inproceedings
    BibTeX:
    			
                            @inproceedings{HlynssonWiskott-2021b,
                              author       = {Hlynsson, Hlynur Davíð and Wiskott, Laurenz},
                              title        = {Reward Prediction for Representation Learning and Reward Shaping},
                              booktitle    = {Proceedings of the 13th International Joint Conference on Computational Intelligence - Volume 1: NCTA},
                              publisher    = {SciTePress},
                              year         = {2021},
                              pages        = {267--276},
                              doi          = {http://doi.org/10.5220/0010640200003063}
                            }
    			
    					
    Kempermann, G. & Wiskott, L. 2004 What is the functional role of new neurons in the adult dentate gyrus? Proc. Stem Cells in the Nervous System: Functional and Clinical Implications 2003, Jan 20, Paris, France , Research and Perspectives in Neurosciences (Fondation Ipsen) , 57-65 .
     
    inproceedings Adult neurogenesis: Function I (2000-2003)
    BibTeX:
    			
                            @inproceedings{KempermannWiskott-2004,
                              author       = {Gerd Kempermann and Laurenz Wiskott},
                              title        = {What is the functional role of new neurons in the adult dentate gyrus?},
                              booktitle    = {Proc.\ Stem Cells in the Nervous System:~Functional and Clinical Implications 2003, Jan 20, Paris, France},
                              publisher    = {Springer},
                              year         = {2004},
                              pages        = {57--65},
                              url          = {https://link.springer.com/chapter/10.1007/978-3-642-18883-1_4},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/KempermannWiskott-2004-Paris-Neurogenesis-Preprint.pdf},
                              doi          = {http://doi.org/10.1007/978-3-642-18883-1_4}
                            }
    			
    					
    Kempermann, G.; Wiskott, L. & Gage, F.H. 2004 Functional significance of adult neurogenesis Curr. Opin. Neurobiol. , 14(2) , 186-191 .
     
    article Adult neurogenesis: Function I (2000-2003)
    Abstract: 'Function' is the key criterion for determining whether adult neurogenesis - be it endogenous, induced, or after transplantation - is successful and has truly generated new nerve cells. Function, however, is an elusive and problematic term. A satisfying statement of function will require evaluation on the three conceptual levels of cells, networks and systems - and potentially even beyond, on the level of psychology. Neuronal development is a lengthy process, a fact that must be considered when judging causes and consequences in experiments that address function and function-dependent regulation of adult neurogenesis. Nevertheless, the information that has been obtained and published so far provides ample evidence that adult-generated neurons can function and even suggests how they might contribute to cognitive processes.
    BibTeX:
    			
                            @article{KempermannWiskottEtAl-2004,
                              author       = {Gerd Kempermann and Laurenz Wiskott and Fred H. Gage},
                              title        = {Functional significance of adult neurogenesis},
                              journal      = {Curr.\ Opin.\ Neurobiol.},
                              year         = {2004},
                              volume       = {14},
                              number       = {2},
                              pages        = {186--191},
                              url          = {http://www.sciencedirect.com/science/article/pii/S0959438804000339},
                              doi          = {http://doi.org/10.1016/j.conb.2004.03.001}
                            }
    			
    					
    Kirste, I.; Lezius, S.; Kronenberg, G.; Klempin, F.; Wiskott, L. & Kempermann, G. 2011 Dynamics of neuronal development in the adult hippocampus: PP-032 Regenerative Medicine , 6(6) .
     
    article
    BibTeX:
    			
                            @article{KirsteLeziusEtAl-2011,
                              author       = {Kirste, I and Lezius, S and Kronenberg, G and Klempin, F and Wiskott, L and Kempermann, G},
                              title        = {Dynamics of neuronal development in the adult hippocampus: PP-032},
                              journal      = {Regenerative Medicine},
                              year         = {2011},
                              volume       = {6},
                              number       = {6}
                            }
    			
    					
    Kompella, V.R. & Wiskott, L. 2017 Intrinsically motivated acquisition of modular slow features for humanoids in continuous and non-stationary environments CoRR e-print arXiv:1701.04663 [cs.AI] .
     
    misc N.N.
    Abstract: A compact information-rich representation of the environment, also called a feature abstraction, can simplify a robot's task of mapping its raw sensory inputs to useful action sequences. However, in environments that are non-stationary and only partially observable, a single abstraction is probably not sufficient to encode most variations. Therefore, learning multiple sets of spatially or temporally local, modular abstractions of the inputs would be beneficial. How can a robot learn these local abstractions without a teacher? More specifically, how can it decide from where and when to start learning a new abstraction? A recently proposed algorithm called Curious Dr. MISFA addresses this problem. The algorithm is based on two underlying learning principles called artificial curiosity and slowness. The former is used to make the robot self-motivated to explore by rewarding itself whenever it makes progress learning an abstraction; the latter is used to update the abstraction by extracting slowly varying components from raw sensory inputs. Curious Dr. MISFA's application is, however, limited to discrete domains constrained by a pre-defined state space and has design limitations that make it unstable in certain situations. This paper presents a significant improvement that is applicable to continuous environments, is computationally less expensive, simpler to use with fewer hyperparameters, and stable in certain non-stationary environments. We demonstrate the efficacy and stability of our method in a vision-based robot simulator.
    BibTeX:
    			
                            @misc{KompellaWiskott-2017,
                              author       = {Varun Raj Kompella and Laurenz Wiskott},
                              title        = {Intrinsically motivated acquisition of modular slow features for humanoids in continuous and non-stationary environments},
                              journal      = {CoRR},
                              year         = {2017},
                              howpublished = {e-print arXiv:1701.04663 [cs.AI]},
                              url          = {https://arxiv.org/abs/1701.04663}
                            }
    			
    					
    Krüger, N.; Janssen, P.; Kalkan, S.; Lappe, M.; Leonardis, A.; Piater, J.; Rodriguez-Sánchez, A. & Wiskott, L. 2013 Deep hierarchies in the primate visual cortex: what can we learn for computer vision? IEEE Trans. on Pattern Analysis and Machine Intelligence , 35(8) , 1847-1871 .
     
    article N.N.
    Abstract: Computational modeling of the primate visual system yields insights of potential relevance to some of the challenges that computer vision is facing, such as object recognition and categorization, motion detection and activity recognition or vision-based navigation and manipulation. This article reviews some functional principles and structures that are generally thought to underlie the primate visual cortex, and attempts to extract biological principles that could further advance computer vision research. Organized for a computer vision audience, we present functional principles of the processing hierarchies present in the primate visual system considering recent discoveries in neurophysiology. The hierarchical processing in the primate visual system is characterized by a sequence of different levels of processing (in the order of ten) that constitute a deep hierarchy in contrast to the flat vision architectures predominantly used in today's mainstream computer vision. We hope that the functional description of the deep hierarchies realized in the primate visual system provides valuable insights for the design of computer vision algorithms, fostering increasingly productive interaction between biological and computer vision research.
    BibTeX:
    			
                            @article{KruegerJanssenEtAl-2013,
                              author       = {Krüger, N. and Janssen, P. and Kalkan, S. and Lappe, M. and Leonardis, A. and Piater, J. and Rodriguez-Sánchez, A. and Wiskott, L.},
                              title        = {Deep hierarchies in the primate visual cortex: what can we learn for computer vision?},
                              journal      = {IEEE Trans.\ on Pattern Analysis and Machine Intelligence},
                              year         = {2013},
                              volume       = {35},
                              number       = {8},
                              pages        = {1847--1871},
                              url          = {http://ieeexplore.ieee.org/xpl/articleDetails.jsp?tp=&arnumber=6389683},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/KruegerJanssenEtAl-2013-IEEEPAMI.pdf},
                              doi          = {http://doi.org/10.1109/TPAMI.2012.272}
                            }
    			
    					
    Lange, M.; Krystiniak, N.; Engelhardt, R.C.; Konen, W. & Wiskott, L. 2023 Improving Reinforcement Learning Efficiency with Auxiliary Tasks in Non-Visual Environments: A Comparison .
     
    misc
    BibTeX:
    			
                            @misc{LangeKrystiniakEtAl-2023a,
                              author       = {Moritz Lange and Noah Krystiniak and Raphael C. Engelhardt and Wolfgang Konen and Laurenz Wiskott},
                              title        = {Improving Reinforcement Learning Efficiency with Auxiliary Tasks in Non-Visual Environments: A Comparison},
                              year         = {2023},
    			  
                              doi          = {http://doi.org/10.48550/arXiv.2310.04241}
                            }
    			
    					
    Lange, M.; Krystiniak, N.; Engelhardt, R.C.; Konen, W. & Wiskott, L. 2023 Improving Reinforcement Learning Efficiency with Auxiliary Tasks in Non-Visual Environments: A Comparison Proc. 9th International Conference on Machine Learning, Optimization and Data Science (LOD) .
    (Best Paper Award)  
    inproceedings
    BibTeX:
    			
                            @inproceedings{LangeKrystiniakEtAl-2023b,
                              author       = {Lange, Moritz and Krystiniak, Noah and Engelhardt, Raphael C. and Konen, Wolfgang and Wiskott, Laurenz},
                              title        = {Improving Reinforcement Learning Efficiency with Auxiliary Tasks in Non-Visual Environments: A Comparison},
                              booktitle    = {Proc. 9th International Conference on Machine Learning, Optimization and Data Science (LOD)},
                              year         = {2023}
                            }
    			
    					
    Legenstein, R.; Wilbert, N. & Wiskott, L. 2010 Reinforcement learning on slow features of high-dimensional input streams PLoS Comput Biol , 6(8) , e1000894 .
     
    article SFA and RL on visual input (2008,2009)
    Abstract: Humans and animals are able to learn complex behaviors based on a massive stream of sensory information from different modalities. Early animal studies have identified learning mechanisms that are based on reward and punishment such that animals tend to avoid actions that lead to punishment whereas rewarded actions are reinforced. However, most algorithms for reward-based learning are only applicable if the dimensionality of the state-space is sufficiently small or its structure is sufficiently simple. Therefore, the question arises how the problem of learning on high-dimensional data is solved in the brain. In this article, we propose a biologically plausible generic two-stage learning system that can directly be applied to raw high-dimensional input streams. The system is composed of a hierarchical slow feature analysis (SFA) network for preprocessing and a simple neural network on top that is trained based on rewards. We demonstrate by computer simulations that this generic architecture is able to learn quite demanding reinforcement learning tasks on high-dimensional visual input streams in a time that is comparable to the time needed when an explicit highly informative low-dimensional state-space representation is given instead of the high-dimensional visual input. The learning speed of the proposed architecture in a task similar to the Morris water maze task is comparable to that found in experimental studies with rats. This study thus supports the hypothesis that slowness learning is one important unsupervised learning principle utilized in the brain to form efficient state representations for behavioral learning.
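    As a toy illustration of the slowness principle behind the SFA preprocessing stage described above (a minimal linear SFA on a made-up signal, not the paper's hierarchical network on visual input):

    ```python
    import numpy as np

    def linear_sfa(X, n_out):
        """Minimal linear SFA sketch: X has shape (T, D). Returns the n_out
        slowest unit-variance output signals."""
        X = X - X.mean(axis=0)
        # Whiten the input.
        cov = np.cov(X, rowvar=False)
        evals, evecs = np.linalg.eigh(cov)
        Z = X @ (evecs / np.sqrt(evals))
        # Diagonalize the covariance of the time derivatives; the slowest
        # directions correspond to the smallest eigenvalues.
        dcov = np.cov(np.diff(Z, axis=0), rowvar=False)
        _, devecs = np.linalg.eigh(dcov)
        return Z @ devecs[:, :n_out]

    # Toy example: a slow sine hidden in a random linear mixture of fast noise.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 4 * np.pi, 2000)
    slow = np.sin(t)
    fast = rng.standard_normal((2000, 3))
    X = np.column_stack([slow, fast]) @ rng.standard_normal((4, 4))
    y = linear_sfa(X, 1)[:, 0]  # recovers the sine up to sign and scale
    ```

    In the paper this kind of slow representation, extracted by a hierarchical SFA network from raw images, serves as the low-dimensional state input to the reward-trained network on top.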
    BibTeX:
    			
                            @article{LegensteinWilbertEtAl-2010,
                              author       = {Robert Legenstein and Niko Wilbert and Laurenz Wiskott},
                              title        = {Reinforcement learning on slow features of high-dimensional input streams},
                              journal      = {PLoS Comput Biol},
                              publisher    = {Public Library of Science},
                              year         = {2010},
                              volume       = {6},
                              number       = {8},
                              pages        = {e1000894},
    			  
    			  url          = {http://dx.doi.org/10.1371/journal.pcbi.1000894},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/LegensteinWilbertEtAl-2010-PLoSCompBiol.pdf},
    			  
                              doi          = {http://doi.org/10.1371/journal.pcbi.1000894}
                            }
    			
    					
    Lezius, S. 2007 Statistik und Modellierung der Dynamik adulter hippocampaler Neurogenese bei Mäusen Diploma thesis, Department of Mathematics and Computer Science , Ernst-Moritz-Arndt-University Greifswald, D-17487 Greifswald, Germany .
     
    mastersthesis Adult neurogenesis: Dynamics II (2006-2013)
    BibTeX:
    			
                            @mastersthesis{Lezius-2007,
                              author       = {Susanne Lezius},
                              title        = {Statistik und {M}odellierung der {D}ynamik adulter hippocampaler {N}eurogenese bei {M}äusen},
                              school       = {Department of Mathematics and Computer Science},
                              year         = {2007}
                            }
    			
    					
    Lezius, S.; Kirste, I.; Bandt, C.; Kempermann, G. & Wiskott, L. 2009 Quantitative modeling of the dynamics of adult hippocampal neurogenesis in mice Proc. 18th Annual Computational Neuroscience Meeting (CNS'09), Jul 18-23, Berlin, Germany .
    (Special issue of BMC Neuroscience 10(Suppl 1):P335)  
    inproceedings Adult neurogenesis: Dynamics II (2006-2013)
    Abstract: The hippocampus is special in that it generates new neurons throughout life. This development of new granule cells in the adult dentate gyrus is referred to as adult hippocampal neurogenesis. A kinetic model of this development has been established [1]. Therein the process of neurogenesis is composed of a sequence of different cell types. However, the exact dynamics of neuronal development in the dentate gyrus are unknown. To quantify the development we collected time-series like data. By injections of BrdU, the dividing cells were labeled and cell numbers could be counted at different time points after injection (2 h to 21 d). We determined relative numbers of BrdU-positive cells of the respective types. These numbers allow us to monitor the development of a labeled cell cohort through the different cell types over the analyzed time period. We also determined absolute cell numbers of different unlabeled populations. They do not show a time dependence, which leads to the idea of a dynamic equilibrium of cells of the different cell types on the analyzed timescale. Based on the known properties of the process and a prior model we established a detailed mathematical model containing the different developmental stages. Here we used the idea of the Leslie matrix, which is a discrete and age-structured model of population growth. The transition probabilities were found by fitting the parameters of the model to the data. We also included a simulated labeling process by which the initial cell populations are determined in a self-consistent manner based on the transition probabilities of the model. Furthermore, the effect of label dilution is included by applying a sigmoidal detectability function. The results of the model match well the data (see figure 1). The model enables us to deduce rates for division and death of cells of the different cell types as well as properties of the labeling process. Finally, based on the eigenvectors of the transition matrix we derive an estimate for the population of unlabeled cells, which matches the experimental data well without being fitted to them. Figure 1. The comparison of the model output and the data shows good agreement for all cell types. References 1. Kempermann G, Jessberger S, Steiner B, Kronenberg G: Milestones of neuronal development in the adult hippocampus. Trends Neurosci 2004, 27:447-452.
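    The Leslie-matrix idea used here can be sketched in a few lines. The stage structure and all transition/survival values below are made up for illustration; they are not the fitted parameters of the paper's model:

    ```python
    import numpy as np

    # Stage-structured transition matrix: entry T[j, i] is the per-step
    # probability that a cell in stage i is found in stage j one step later.
    # Column sums below 1 encode cell death at each stage.
    T = np.array([
        [0.60, 0.00, 0.00, 0.00],   # stay in stage 1 (proliferating precursors)
        [0.30, 0.70, 0.00, 0.00],   # advance to / stay in stage 2
        [0.00, 0.20, 0.80, 0.00],   # advance to / stay in stage 3
        [0.00, 0.00, 0.10, 0.95],   # advance to / stay in stage 4 (mature cells)
    ])

    def propagate(cohort, steps):
        """Propagate a labeled cell cohort through the stages for `steps` steps."""
        for _ in range(steps):
            cohort = T @ cohort
        return cohort

    cohort0 = np.array([100.0, 0.0, 0.0, 0.0])  # labeled cells start in stage 1
    cohort10 = propagate(cohort0, 10)
    ```

    In the paper the transition probabilities are fitted to the BrdU time-series data, and equilibrium stage populations follow from the eigenvectors of the transition matrix.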
    BibTeX:
    			
                            @inproceedings{LeziusKirsteEtAl-2009,
                              author       = {Susanne Lezius and Imke Kirste and Christoph Bandt and Gerd Kempermann and Laurenz Wiskott},
                              title        = {Quantitative modeling of the dynamics of adult hippocampal neurogenesis in mice},
                              booktitle    = {Proc.\ 18th Annual Computational Neuroscience Meeting (CNS'09), Jul 18--23, Berlin, Germany},
                              year         = {2009},
    			  
    			  url          = {http://www.biomedcentral.com/1471-2202/10/S1/P335},
                              url3         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/LeziusKirsteEtAl-2009-ProcCNSBerlin-Poster-NurogenesisDynamics2.pdf},
    			  
                              doi          = {http://doi.org/10.1186/1471-2202-10-S1-P335}
                            }
    			
    					
    Maeyer, L.D.; Nicola, A.D.; Maetche, R.; von der Malsburg, C. & Wiskott, L. 1989 An experimental multiprocessor system for distributed parallel computations Microprocessing and Microprogramming , 26 , 305-317 .
     
    article Multiprocessor system (1986-1988)
    Abstract: The availability of low-cost microprocessor chips with efficient instruction sets for specific numerical tasks (signal processors) has been exploited for building a versatile multiprocessor system, consisting of a host minicomputer augmented by a number of joint processors. The host provides a multiuser-multitasking environment and manages system resources and task scheduling. User applications can call upon one or more joint processors for parallel execution of adequately partitioned, computationally intensive numeric operations. Each joint processor has sufficient local memory for storing procedures and data and has access to regions in host memory for shared data. Kernel processes in the host and in the joint processors provide the necessary mechanism for initialization and synchronization of the distributed parallel execution of procedures.
    BibTeX:
    			
                            @article{MaeyerNicolaEtAl-1989,
                              author       = {L. De Maeyer and A. Di Nicola and R. Maetche and Christoph von der Malsburg and Laurenz Wiskott},
                              title        = {An experimental multiprocessor system for distributed parallel computations},
                              journal      = {Microprocessing and Microprogramming},
                              year         = {1989},
                              volume       = {26},
                              pages        = {305--317},
    			  
    			  url          = {http://www.sciencedirect.com/science/article/pii/016560749090330C?via=ihub},
    			  
                              doi          = {http://doi.org/10.1016/0165-6074(90)90330-C}
                            }
    			
    					
    Melchior, J. 2012 Learning natural image statistics with Gaussian-binary Restricted Boltzmann Machines ET-IT Dept., Ruhr-University Bochum, Germany .
     
    mastersthesis RBM: Modeling natural images (2012-2014)
    BibTeX:
    			
                            @mastersthesis{Melchior-2012,
                              author       = {Jan Melchior},
                              title        = {Learning natural image statistics with {G}aussian-binary {R}estricted {B}oltzmann {M}achines},
                              school       = {ET-IT Dept., Ruhr-University Bochum, Germany},
                              year         = {2012},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/Melchior-2012-MasterThesis-RBMs.pdf}
                            }
    			
    					
    Melchior, J. 2020 On the importance of centering in artificial neural networks Doctoral thesis, Ruhr-Universität Bochum, Universitätsbibliothek .
     
    phdthesis
    BibTeX:
    			
                            @phdthesis{Melchior-2020,
                              author       = {Jan Melchior},
                              title        = {On the importance of centering in artificial neural networks},
                              school       = {Ruhr-Universit{\"a}t Bochum, Universit{\"a}tsbibliothek},
                              year         = {2020},
    			  
    			  url          = {https://hss-opus.ub.ruhr-uni-bochum.de/opus4/frontdoor/index/index/start/2/rows/10/sortfield/score/sortorder/desc/searchtype/simple/query/Jan+Melchior/docId/7713},
    			  
                              doi          = {http://doi.org/10.13154/294-7713}
                            }
    			
    					
    Melchior, J.; Bayati, M.; Azizi, A.; Cheng, S. & Wiskott, L. 2019 A Hippocampus Model for Online One-Shot Storage of Pattern Sequences CoRR e-print arXiv:1905.12937 .
     
    misc
    BibTeX:
    			
                            @misc{MelchiorBayatiEtAl-2019,
                              author       = {Melchior, Jan and Bayati, Mehdi and Azizi, Amir and Cheng, Sen and Wiskott, Laurenz},
                              title        = {A Hippocampus Model for Online One-Shot Storage of Pattern Sequences},
                              journal      = {CoRR},
                              year         = {2019},
                              howpublished = {e-print arXiv:1905.12937},
    			  
    			  url          = {https://arxiv.org/abs/1905.12937}
                            }
    			
    					
    Melchior, J.; Fischer, A.; Wang, N. & Wiskott, L. 2013 How to center binary Restricted Boltzmann Machines CoRR e-print arXiv:1311.1354 .
    (latest version 2015-07-16 v3)  
    misc RBM: Centering (2013-2015)
    BibTeX:
    			
                            @misc{MelchiorFischerEtAl-2013,
                              author       = {Jan Melchior and Asja Fischer and Nan Wang and Laurenz Wiskott},
                              title        = {How to center binary {R}estricted {B}oltzmann {M}achines},
                              journal      = {CoRR},
                              year         = {2013},
                              howpublished = {e-print arXiv:1311.1354},
    			  
    			  url          = {https://arxiv.org/abs/1311.1354}
                            }
    			
    					
    Melchior, J.; Fischer, A. & Wiskott, L. 2016 How to center Deep Boltzmann Machines Journal of Machine Learning Research , 17(99) , 1-61 .
     
    article RBM: Centering (2013-2015)
    Abstract: This work analyzes centered Restricted Boltzmann Machines (RBMs) and centered Deep Boltzmann Machines (DBMs), where centering is done by subtracting offset values from visible and hidden variables. We show analytically that (i) centered and normal Boltzmann Machines (BMs) and thus RBMs and DBMs are different parameterizations of the same model class, such that any normal BM/RBM/DBM can be transformed to an equivalent centered BM/RBM/DBM and vice versa, and that this equivalence generalizes to artificial neural networks in general, (ii) the expected performance of centered binary BMs/RBMs/DBMs is invariant under simultaneous flip of data and offsets, for any offset value in the range of zero to one, (iii) centering can be reformulated as a different update rule for normal BMs/RBMs/DBMs, and (iv) using the enhanced gradient is equivalent to setting the offset values to the average over model and data mean. Furthermore, we present numerical simulations suggesting that (i) optimal generative performance is achieved by subtracting mean values from visible as well as hidden variables, (ii) centered binary RBMs/DBMs reach significantly higher log-likelihood values than normal binary RBMs/DBMs, (iii) centering variants whose offsets depend on the model mean, like the enhanced gradient, suffer from severe divergence problems, (iv) learning is stabilized if an exponentially moving average over the batch means is used for the offset values instead of the current batch mean, which also prevents the enhanced gradient from severe divergence, (v) on a similar level of log-likelihood values centered binary RBMs/DBMs have smaller weights and bigger bias parameters than normal binary RBMs/DBMs, (vi) centering leads to an update direction that is closer to the natural gradient, which is extremely efficient for training as we show for small binary RBMs, (vii) centering eliminates the need for greedy layer-wise pre-training of DBMs, which often even deteriorates the results independently of whether centering is used or not, and (ix) centering is also beneficial for auto encoders.
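    The centering trick itself is compact. A minimal sketch for a binary RBM trained with CD-1 (variable names, sizes, and the fixed hidden offset of 0.5 are illustrative choices, not the paper's exact setup):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def centered_rbm_update(V_data, W, b, c, mu, lam, lr=0.01, k=1):
        """One CD-k weight update for a centered binary RBM: visible and hidden
        states are shifted by offsets mu / lam before entering the gradient."""
        sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
        h_prob = lambda V: sigmoid((V - mu) @ W + c)
        v_prob = lambda H: sigmoid((H - lam) @ W.T + b)
        H_data = h_prob(V_data)
        V_model = V_data
        for _ in range(k):  # short Gibbs chain for the model term
            H_s = (h_prob(V_model) > rng.random(H_data.shape)).astype(float)
            V_model = (v_prob(H_s) > rng.random(V_data.shape)).astype(float)
        H_model = h_prob(V_model)
        n = V_data.shape[0]
        grad_W = ((V_data - mu).T @ (H_data - lam)
                  - (V_model - mu).T @ (H_model - lam)) / n
        return W + lr * grad_W

    # Toy usage: data mean as visible offset, 0.5 as hidden offset.
    V = (rng.random((20, 6)) > 0.5).astype(float)
    W = 0.01 * rng.standard_normal((6, 4))
    b, c = np.zeros(6), np.zeros(4)
    mu, lam = V.mean(axis=0), np.full(4, 0.5)
    W_new = centered_rbm_update(V, W, b, c, mu, lam)
    ```

    The paper's recommendation of an exponentially moving average over batch means would replace the fixed offsets above with slowly updated estimates.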
    BibTeX:
    			
                            @article{MelchiorFischerEtAl-2016,
                              author       = {Jan Melchior and Asja Fischer and Laurenz Wiskott},
                              title        = {How to center {D}eep {B}oltzmann {M}achines},
                              journal      = {Journal of Machine Learning Research},
                              year         = {2016},
                              volume       = {17},
                              number       = {99},
                              pages        = {1--61},
    			  
    			  url          = {http://jmlr.org/papers/v17/14-237.html}
                            }
    			
    					
    Melchior, J.; Wang, N. & Wiskott, L. 2017 Gaussian-binary restricted Boltzmann machines for modeling natural image statistics PLoS One , 12(2) , e0171015 .
     
    article RBM: Modeling natural images (2012-2014)
    Abstract: We present a theoretical analysis of Gaussian-binary restricted Boltzmann machines (GRBMs) from the perspective of density models. The key aspect of this analysis is to show that GRBMs can be formulated as a constrained mixture of Gaussians, which gives a much better insight into the model’s capabilities and limitations. We further show that GRBMs are capable of learning meaningful features without using a regularization term and that the results are comparable to those of independent component analysis. This is illustrated for both a two-dimensional blind source separation task and for modeling natural image patches. Our findings exemplify that reported difficulties in training GRBMs are due to the failure of the training algorithm rather than the model itself. Based on our analysis we derive a better training setup and show empirically that it leads to faster and more robust training of GRBMs. Finally, we compare different sampling algorithms for training GRBMs and show that Contrastive Divergence performs better than training methods that use a persistent Markov chain.
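    The constrained-mixture view can be made concrete: each of the 2^J hidden configurations of a GRBM with J binary hidden units contributes one Gaussian component with a shifted mean. A sketch under one common GRBM energy convention (component mean b + sigma^2 * W @ h; the exact scaling varies across conventions in the literature):

    ```python
    import numpy as np
    from itertools import product

    def grbm_mixture_means(W, b, sigma=1.0):
        """Component means of the Gaussian mixture defined by a GRBM with
        visible bias b, weights W (shape D x J) and visible std sigma: one
        component per binary hidden configuration h in {0,1}^J."""
        J = W.shape[1]
        return [b + sigma**2 * W @ np.array(h, dtype=float)
                for h in product([0, 1], repeat=J)]

    # Toy usage: J = 2 hidden units give 2^2 = 4 component means.
    W = np.ones((3, 2))
    b = np.zeros(3)
    means = grbm_mixture_means(W, b)
    ```

    The mixture is "constrained" because the 2^J means are not free parameters; they are all generated from the same D x J weight matrix.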
    BibTeX:
    			
                            @article{MelchiorWangEtAl-2017,
                              author       = {Melchior, J. and Wang, N. and Wiskott, L.},
                              title        = {Gaussian-binary restricted {B}oltzmann machines for modeling natural image statistics},
                              journal      = {PLoS One},
                              year         = {2017},
                              volume       = {12},
                              number       = {2},
                              pages        = {e0171015},
    			  
    			  url          = {http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0171015},
    			  
                              doi          = {http://doi.org/10.1371/journal.pone.0171015}
                            }
    			
    					
    Melchior, J. & Wiskott, L. 2019 Hebbian-Descent CoRR e-print arXiv:1905.10585 .
     
    misc
    BibTeX:
    			
                            @misc{MelchiorWiskott-2019,
                              author       = {Jan Melchior and Laurenz Wiskott},
                              title        = {Hebbian-Descent},
                              journal      = {CoRR},
                              year         = {2019},
                              howpublished = {e-print arXiv:1905.10585},
    			  
    			  url          = {https://arxiv.org/abs/1905.10585}
                            }
    			
    					
    Menne, M.; Schüler, M. & Wiskott, L. 2021 Exploring Slow Feature Analysis for Extracting Generative Latent Factors Proceedings of the 10th International Conference on Pattern Recognition Applications and Methods .
     
    inproceedings
    BibTeX:
    			
                            @inproceedings{MenneSchuelerWiskott2021,
                              author       = {Menne, Max and Schüler, Merlin and Wiskott, Laurenz},
                              title        = {Exploring Slow Feature Analysis for Extracting Generative Latent Factors},
                              booktitle    = {Proceedings of the 10th International Conference on Pattern Recognition Applications and Methods},
                              publisher    = {SCITEPRESS - Science and Technology Publications},
                              year         = {2021},
    			  
                              doi          = {http://doi.org/10.5220/0010391401200131}
                            }
    			
    					
    Neher, T. 2015 Analysis of the formation of memory and place cells in the hippocampus: a computational approach International Graduate School of Neuroscience (IGSN), Ruhr-Universität-Bochum , 153 .
     
    phdthesis Hippocampal memory model (2010-2014)
    BibTeX:
    			
                            @phdthesis{Neher-2015,
                              author       = {Neher, Torsten},
                              title        = {Analysis of the formation of memory and place cells in the hippocampus: a computational approach},
                              school       = {International Graduate School of Neuroscience (IGSN), Ruhr-Universit\"{a}t-Bochum},
                              year         = {2015},
                              pages        = {153},
    			  
    			  url          = {http://hss-opus.ub.ruhr-uni-bochum.de/opus4/frontdoor/index/index/docId/4739},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/Neher-2015-PhDThesis.pdf}
                            }
    			
    					
    Neher, T.; Cheng, S. & Wiskott, L. 2013 Are memories really stored in the hippocampal CA3 region? Proc. 10th Göttinger Meeting of the German Neuroscience Society, Mar 13-16, Göttingen, Germany , 104 .
     
    inproceedings Hippocampal memory model (2010-2014)
    BibTeX:
    			
                            @inproceedings{NeherChengEtAl-2013a,
                              author       = {Neher, Torsten and Cheng, Sen and Wiskott, Laurenz},
                              title        = {Are memories really stored in the hippocampal {CA3} region?},
                              booktitle    = {Proc.\ 10th Göttinger Meeting of the German Neuroscience Society, Mar 13-16, Göttingen, Germany},
                              year         = {2013},
                              pages        = {104}
                            }
    			
    					
    Neher, T.; Cheng, S. & Wiskott, L. 2013 Are memories really stored in the hippocampal CA3 region? BoNeuroMed , Ruhr-Universität Bochum , 1 , 38-41 .
    (BoNeuroMed)  
    techreport Hippocampal memory model (2010-2014)
    BibTeX:
    			
                            @techreport{NeherChengEtAl-2013b,
                              author       = {Neher, Torsten and Cheng, Sen and Wiskott, Laurenz},
                              title        = {Are memories really stored in the hippocampal {CA3} region?},
                              journal      = {BoNeuroMed},
                              school       = {Ruhr-Universität Bochum},
                              year         = {2013},
                              volume       = {1},
                              pages        = {38--41}
                            }
    			
    					
    Neher, T.; Cheng, S. & Wiskott, L. 2015 Memory storage fidelity in the hippocampal circuit: the role of subregions and input statistics PLoS Comput Biol , 11 , e1004250 .
     
    article Hippocampal memory model (2010-2014)
    Abstract: In the last decades a standard model regarding the function of the hippocampus in memory formation has been established and tested computationally. It has been argued that the CA3 region works as an auto-associative memory and that its recurrent fibers are the actual storing place of the memories. Furthermore, to work properly CA3 requires memory patterns that are mutually uncorrelated. It has been suggested that the dentate gyrus orthogonalizes the patterns before storage, a process known as pattern separation. In this study we review the model when random input patterns are presented for storage and investigate whether it is capable of storing patterns of more realistic entorhinal grid cell input. Surprisingly, we find that an auto-associative CA3 net is redundant for random inputs up to moderate noise levels and is only beneficial at high noise levels. When grid cell input is presented, auto-association is even harmful for memory performance at all levels. Furthermore, we find that Hebbian learning in the dentate gyrus does not support its function as a pattern separator. These findings challenge the standard framework and support an alternative view where the simpler EC-CA1-EC network is sufficient for memory storage.
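    To make the notion of an auto-associative memory concrete, here is a generic Hopfield-style sketch (illustrative only; the paper's model uses a more detailed hippocampal circuit with separate EC, DG, CA3 and CA1 stages):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Store 5 random +/-1 patterns of size 100 in a recurrent weight matrix
    # via the Hebbian outer-product rule.
    patterns = rng.choice([-1.0, 1.0], size=(5, 100))
    W = patterns.T @ patterns / 100.0
    np.fill_diagonal(W, 0.0)

    def recall(cue, steps=10):
        """Pattern completion: iterate the recurrent dynamics from a noisy cue."""
        s = cue.copy()
        for _ in range(steps):
            s = np.sign(W @ s)
            s[s == 0] = 1.0
        return s

    # Noisy cue: flip 10% of the bits of the first stored pattern.
    cue = patterns[0].copy()
    flip = rng.choice(100, size=10, replace=False)
    cue[flip] *= -1
    out = recall(cue)
    ```

    The paper's question is precisely whether such recurrent auto-association in CA3 is needed at all, given realistic (e.g. grid-cell) input statistics.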
    BibTeX:
    			
                            @article{NeherChengEtAl-2015,
                              author       = {T. Neher and S. Cheng and L. Wiskott},
                              title        = {Memory storage fidelity in the hippocampal circuit: the role of subregions and input statistics},
                              journal      = {PLoS Comput Biol},
                              year         = {2015},
                              volume       = {11},
                              pages        = {e1004250},
    			  
    			  url          = {http://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1004250},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/NeherChengWiskott-2015-PLoSCompBiol-HippocampalMemoryModel.pdf},
    			  
                              doi          = {http://doi.org/10.1371/journal.pcbi.1004250}
                            }
    			
    					
    Parra-Barrero, E.; Vijayabaskaran, S.; Seabrook, E.; Wiskott, L. & Cheng, S. 2023 A Map of Spatial Navigation for Neuroscience .
     
    misc
    Abstract: An animal's ability to navigate space is crucial to its survival. It is also cognitively demanding, and relatively easy to probe. For these reasons, spatial navigation has received a great deal of attention from neuroscientists, leading to the identification of key brain areas and the ongoing discovery of a "zoo" of cell types responding to different aspects of spatial tasks. Despite this progress, our understanding of how the pieces fit together to drive behavior is generally lacking. We argue that this is partly caused by insufficient communication between researchers focusing on spatial behavior and those attempting to study its neural basis. This has led the latter to under-appreciate the relevance and complexity of spatial behavior, and to focus too narrowly on characterizing neural representations of space—disconnected from the computations these representations are meant to enable. We therefore propose a taxonomy of navigation processes in mammals that can serve as a common framework for structuring and facilitating interdisciplinary research in the field. Using the taxonomy as a guide, we review behavioral and neural studies of spatial navigation. In doing so, we both validate the taxonomy and showcase its usefulness in identifying potential issues with common experimental approaches, designing experiments that adequately target particular behaviors, correctly interpreting neural activity, and pointing to new avenues of research.
    BibTeX:
    			
                            @misc{Parra-BarreroVijayabaskaranEtAl-2023a,
                              author       = {{Parra-Barrero}, Eloy and Vijayabaskaran, Sandhiya and Seabrook, Eddie and Wiskott, Laurenz and Cheng, Sen},
                              title        = {A Map of Spatial Navigation for Neuroscience},
                              publisher    = {OSF Preprints},
                              year         = {2023},
    			  
                              doi          = {http://doi.org/10.31219/osf.io/a86gq}
                            }
    			
    					
    Parra-Barrero, E.; Vijayabaskaran, S.; Seabrook, E.; Wiskott, L. & Cheng, S. 2023 A map of spatial navigation for neuroscience Neuroscience & Biobehavioral Reviews , 152 , 105200 .
     
    article
    Abstract: Spatial navigation has received much attention from neuroscientists, leading to the identification of key brain areas and the discovery of numerous spatially selective cells. Despite this progress, our understanding of how the pieces fit together to drive behavior is generally lacking. We argue that this is partly caused by insufficient communication between behavioral and neuroscientific researchers. This has led the latter to under-appreciate the relevance and complexity of spatial behavior, and to focus too narrowly on characterizing neural representations of space—disconnected from the computations these representations are meant to enable. We therefore propose a taxonomy of navigation processes in mammals that can serve as a common framework for structuring and facilitating interdisciplinary research in the field. Using the taxonomy as a guide, we review behavioral and neural studies of spatial navigation. In doing so, we validate the taxonomy and showcase its usefulness in identifying potential issues with common experimental approaches, designing experiments that adequately target particular behaviors, correctly interpreting neural activity, and pointing to new avenues of research.
    BibTeX:
    			
                            @article{Parra-BarreroVijayabaskaranEtAl-2023b,
                              author       = {Eloy Parra-Barrero and Sandhiya Vijayabaskaran and Eddie Seabrook and Laurenz Wiskott and Sen Cheng},
                              title        = {A map of spatial navigation for neuroscience},
                              journal      = {Neuroscience \& Biobehavioral Reviews},
                              year         = {2023},
                              volume       = {152},
                              pages        = {105200},
    			  
    			  url          = {https://www.sciencedirect.com/science/article/pii/S0149763423001690},
    			  
                              doi          = {http://doi.org/10.1016/j.neubiorev.2023.105200}
                            }
    			
    					
    Pillonetto, G.; Ha Quang, M. & Chiuso, A. 2011 A new kernel-based approach for nonlinear system identification IEEE Trans. on Automatic Control , 56(12) .
     
    article N.N.
    BibTeX:
    			
                            @article{PillonettoHaQuangEtAl-2011,
                              author       = {Gianluigi Pillonetto and Ha Quang, Minh and Alessandro Chiuso},
                              title        = {A new kernel-based approach for nonlinear system identification},
                              journal      = {IEEE Trans.\ on Automatic Control},
                              year         = {2011},
                              volume       = {56},
                              number       = {12},
    			  
    			  url          = {http://ieeexplore.ieee.org/document/5738321/},
    			  
                              doi          = {http://doi.org/10.1109/tac.2011.2131830}
                            }
    			
    					
    Pötzsch, M.; Maurer, T.; Wiskott, L. & von der Malsburg, C. 1996 Reconstruction from graphs labeled with responses of Gabor filters Proc. Intl. Conf. on Artificial Neural Networks (ICANN'96), Bochum, Germany , 845-850 .
     
    inproceedings Reconstruction from Gabor wavelets (1993-1995)
    Abstract: The work presented is part of a larger effort to build a general object recognition system. Objects as well as human faces are represented by graphs labeled with Gabor filter responses. We describe an optimal method to reconstruct images from such graphs. Two examples of how this can be used to analyze the object representation or to compensate for its deficiencies are presented. Since the reconstruction method is formulated generally for an arbitrary set of linear filters, it can also be applied to data produced by other systems, artificial or biological.
    BibTeX:
    			
                            @inproceedings{PoetzschMaurerEtAl-1996,
                              author       = {Michael Pötzsch and Thomas Maurer and Laurenz Wiskott and Christoph von der Malsburg},
                              title        = {Reconstruction from graphs labeled with responses of {G}abor filters},
                              booktitle    = {Proc.\ Intl.\ Conf.\ on Artificial Neural Networks (ICANN'96), Bochum, Germany},
                              publisher    = {Springer-Verlag},
                              year         = {1996},
                              pages        = {845--850},
    			  
    			  url          = {https://link.springer.com/chapter/10.1007/3-540-61510-5_142},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/PotzschMaurerEtAl-1996-ProcICANN-Jets-Preprint.pdf},
    			  
                              doi          = {http://doi.org/10.1007/3-540-61510-5_142}
                            }
    			
    					
    Qaadan, S.; Schüler, M. & Glasmachers, T. 2019 Dual SVM Training on a Budget Proceedings of the 8th International Conference on Pattern Recognition Applications and Methods .
     
    inproceedings
    BibTeX:
    			
                            @inproceedings{QaadanSchuelerEtAl-2019,
                              author       = {Qaadan, Sahar and Schüler, Merlin and Glasmachers, Tobias},
                              title        = {Dual SVM Training on a Budget},
                              booktitle    = {Proceedings of the 8th International Conference on Pattern Recognition Applications and Methods},
                              publisher    = {SCITEPRESS - Science and Technology Publications},
                              year         = {2019}
                            }
    			
    					
    Rasch, M. 2003 Modellierung adulter Neurogenese im Hippocampus [Modeling adult neurogenesis in the hippocampus] Diploma thesis, Institute for Biology , Humboldt University Berlin, D-10115 Berlin, Germany .
     
    mastersthesis Adult neurogenesis: Function I (2000-2003)
    BibTeX:
    			
                            @mastersthesis{Rasch-2003,
                              author       = {Malte Rasch},
                              title        = {Modellierung adulter {N}eurogenese im {H}ippocampus [{M}odeling adult neurogenesis in the hippocampus]},
                              school       = {Institute for Biology},
                              year         = {2003},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/Rasch-2003-DiplThesis-Neurogenesis.pdf}
                            }
    			
    					
    Rath-Manakidis, P. 2021 Interaction of Ensembling and Double Descent in Deep Neural Networks MSc thesis, Cognitive Science, Ruhr University Bochum, Germany .
     
    mastersthesis
    BibTeX:
    			
                            @mastersthesis{Rath-Manakidis-2021,
                              author       = {Rath-Manakidis, Pavlos},
                              title        = {Interaction of Ensembling and Double Descent in Deep Neural Networks},
                              school       = {Cognitive Science, Ruhr University Bochum, Germany},
                              year         = {2021},
    			  
    			  url          = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/RathManakidis-2021-MScThesis-DoubleDescent.pdf}
                            }
    			
    					
    Rath-Manakidis, P.; Hlynsson, H.D. & Wiskott, L. 2022 Reduction of Variance-related Error through Ensembling: Deep Double Descent and Out-of-Distribution Generalization Proc. 11th International Conference on Pattern Recognition Applications and Methods (ICPRAM) , 31–40 .
     
    inproceedings
    BibTeX:
    			
                            @inproceedings{Rath-ManakidisHlynssonEtAl-2022,
                              author       = {Rath-Manakidis, Pavlos and Hlynsson, Hlynur Davíð and Wiskott, Laurenz},
                              title        = {Reduction of Variance-related Error through Ensembling: Deep Double Descent and Out-of-Distribution Generalization},
                              booktitle    = {Proc. 11th International Conference on Pattern Recognition Applications and Methods (ICPRAM)},
                              publisher    = {SciTePress},
                              year         = {2022},
                              pages        = {31--40},
    			  
                              doi          = {http://doi.org/10.5220/0010821300003122}
                            }
    			
    					
    Reyhanian, S.; Fayyaz, Z. & Wiskott, L. 2023 Hierarchical Transformer VQ-VAE: An investigation of attentional selection in a generative model of episodic memory Proc. Bernstein Conference, Sep 26-29, Berlin, Germany .
     
    inproceedings
    BibTeX:
    			
                            @inproceedings{ReyhanianFayyazEtAl-2023,
                              author       = {Reyhanian, Shirin and Fayyaz, Zahra and Wiskott, Laurenz},
                              title        = {Hierarchical Transformer VQ-VAE: An investigation of attentional selection in a generative model of episodic memory},
                              booktitle    = {Proc. Bernstein Conference, Sep 26-29, Berlin, Germany},
                              year         = {2023},
    			  
                              doi          = {http://doi.org/10.12751/nncn.bc2023.333}
                            }
    			
    					
    Richthofer, S.; Weghenkel, B. & Wiskott, L. 2012 Predictable Feature Analysis Proc. Bernstein Conference on Computational Neuroscience, Sep 12-14, Munich, Germany .
    (Special issue of Frontiers in Computational Neuroscience 120)  
    inproceedings Predictable Feature Analysis (2010-now)
    BibTeX:
    			
                            @inproceedings{RichthoferWeghenkelEtAl-2012,
                              author       = {Stefan Richthofer and Björn Weghenkel and Laurenz Wiskott},
                              title        = {Predictable {F}eature {A}nalysis},
                              booktitle    = {Proc.\ Bernstein Conference on Computational Neuroscience, Sep 12--14, Munich, Germany},
                              year         = {2012}
                            }
    			
    					
    Richthofer, S. & Wiskott, L. 2013 Predictable Feature Analysis CoRR e-print arXiv:1311.2503 .
     
    misc Predictable Feature Analysis (2010-now)
    Abstract: Every organism in an environment, whether biological, robotic or virtual, must be able to predict certain aspects of its environment in order to survive or perform whatever task is intended. It needs a model that is capable of estimating the consequences of possible actions, so that planning, control, and decision-making become feasible. For scientific purposes, such models are usually created in a problem specific manner using differential equations and other techniques from control- and system-theory. In contrast to that, we aim for an unsupervised approach that builds up the desired model in a self-organized fashion. Inspired by Slow Feature Analysis (SFA), our approach is to extract sub-signals from the input, that behave as predictable as possible. These 'predictable features' are highly relevant for modeling, because predictability is a desired property of the needed consequence-estimating model by definition. In our approach, we measure predictability with respect to a certain prediction model. We focus here on the solution of the arising optimization problem and present a tractable algorithm based on algebraic methods which we call Predictable Feature Analysis (PFA). We prove that the algorithm finds the globally optimal signal, if this signal can be predicted with low error. To deal with cases where the optimal signal has a significant prediction error, we provide a robust, heuristically motivated variant of the algorithm and verify it empirically. Additionally, we give formal criteria a prediction-model must meet to be suitable for measuring predictability in the PFA setting and also provide a suitable default-model along with a formal proof that it meets these criteria.
    BibTeX:
    			
                            @misc{RichthoferWiskott-2013,
                              author       = {Stefan Richthofer and Laurenz Wiskott},
                              title        = {Predictable {F}eature {A}nalysis},
                              journal      = {CoRR},
                              year         = {2013},
                              howpublished = {e-print arXiv:1311.2503},
    			  
    			  url          = {https://arxiv.org/abs/1311.2503}
                            }
    			
    					
    Richthofer, S. & Wiskott, L. 2015 Predictable Feature Analysis IEEE 14th International Conference on Machine Learning and Applications (ICMLA) , 190-196 .
     
    inproceedings Predictable Feature Analysis (2010-now)
    Abstract: Every organism in an environment, whether biological, robotic or virtual, must be able to predict certain aspects of its environment in order to survive or perform whatever task is intended. It needs a model that is capable of estimating the consequences of possible actions, so that planning, control, and decision-making become feasible. For scientific purposes, such models are usually created in a problem specific manner using differential equations and other techniques from control- and system-theory. In contrast to that, we aim for an unsupervised approach that builds up the desired model in a self-organized fashion. Inspired by Slow Feature Analysis (SFA), our approach is to extract subsignals from the input, that behave as predictable as possible. These 'predictable features' are highly relevant for modeling, because predictability is a desired property of the needed consequence-estimating model by definition. In our approach, we measure predictability with respect to a certain prediction model. We focus here on the solution of the arising optimization problem and present a tractable algorithm based on algebraic methods which we call Predictable Feature Analysis (PFA). We prove that the algorithm finds the globally optimal signal if this signal can be predicted with low error. To deal with cases where the optimal signal has a significant prediction error, we provide a robust, heuristically motivated variant of the algorithm and verify it empirically. Additionally, we give formal criteria a prediction model must meet to be suitable for measuring predictability in the PFA setting and also provide a suitable default model along with a formal proof that it meets these criteria.
    BibTeX:
    			
                            @inproceedings{RichthoferWiskott-2015a,
                              author       = {S. Richthofer and L. Wiskott},
                              title        = {Predictable {F}eature {A}nalysis},
                              booktitle    = {IEEE 14th International Conference on Machine Learning and Applications (ICMLA)},
                              year         = {2015},
                              pages        = {190--196},
    			  
    			  url          = {http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=7424307&filter=AND(p_IS_Number:7424247)&pageNumber=2},
    			  
                              doi          = {http://doi.org/10.1109/ICMLA.2015.158}
                            }
    			
    					
    Richthofer, S. & Wiskott, L. 2015 Predictable Feature Analysis Workshop New Challenges in Neural Computation 2015 (NC2) , 68-75 .
     
    inproceedings Predictable Feature Analysis (2010-now)
    BibTeX:
    			
                            @inproceedings{RichthoferWiskott-2015b,
                              author       = {S. Richthofer and L. Wiskott},
                              title        = {Predictable {F}eature {A}nalysis},
                              booktitle    = {Workshop New Challenges in Neural Computation 2015 (NC2)},
                              year         = {2015},
                              pages        = {68--75},
    			  
    			  url          = {https://www.techfak.uni-bielefeld.de/~fschleif/mlr/mlr_03_2015.pdf}
                            }
    			
    					
    Richthofer, S. & Wiskott, L. 2017 PFAx: Predictable Feature Analysis to Perform Control CoRR e-print arXiv:1712.00634 , abs/1712.00634 .
     
    misc
    Abstract: Predictable Feature Analysis (PFA) (Richthofer, Wiskott, ICMLA 2015) is an algorithm that performs dimensionality reduction on high dimensional input signal. It extracts those subsignals that are most predictable according to a certain prediction model. We refer to these extracted signals as predictable features. In this work we extend the notion of PFA to take supplementary information into account for improving its predictions. Such information can be a multidimensional signal like the main input to PFA, but is regarded external. That means it won't participate in the feature extraction - no features get extracted or composed of it. Features will be exclusively extracted from the main input such that they are most predictable based on themselves and the supplementary information. We refer to this enhanced PFA as PFAx (PFA extended). Even more important than improving prediction quality is to observe the effect of supplementary information on feature selection. PFAx transparently provides insight how the supplementary information adds to prediction quality and whether it is valuable at all. Finally we show how to invert that relation and can generate the supplementary information such that it would yield a certain desired outcome of the main signal. We apply this to a setting inspired by reinforcement learning and let the algorithm learn how to control an agent in an environment. With this method it is feasible to locally optimize the agent's state, i.e. reach a certain goal that is near enough. We are preparing a follow-up paper that extends this method such that also global optimization is feasible.
    BibTeX:
    			
                            @misc{RichthoferWiskott-2017,
                              author       = {Stefan Richthofer and Laurenz Wiskott},
                              title        = {{PFAx}: {P}redictable {F}eature {A}nalysis to {P}erform {C}ontrol},
                              journal      = {CoRR},
                              year         = {2017},
                              volume       = {abs/1712.00634},
                              howpublished = {e-print arXiv:1712.00634},
    			  
    			  url          = {https://arxiv.org/abs/1712.00634}
                            }
    			
    					
    Richthofer, S. & Wiskott, L. 2018 Global Navigation Using Predictable and Slow Feature Analysis in Multiroom Environments, Path Planning and Other Control Tasks CoRR e-print arXiv:1805.08565 , abs/1805.08565 .
     
    misc
    Abstract: Extended Predictable Feature Analysis (PFAx) [Richthofer and Wiskott, 2017] is an extension of PFA [Richthofer and Wiskott, 2015] that allows generating a goal-directed control signal of an agent whose dynamics has previously been learned during a training phase in an unsupervised manner. PFAx hardly requires assumptions or prior knowledge of the agent's sensor or control mechanics, or of the environment. It selects features from a high-dimensional input by intrinsic predictability and organizes them into a reasonably low-dimensional model. While PFA obtains a well predictable model, PFAx yields a model ideally suited for manipulations with predictable outcome. This allows for goal-directed manipulation of an agent and thus for local navigation, i.e. for reaching states where intermediate actions can be chosen by a permanent descent of distance to the goal. The approach is limited when it comes to global navigation, e.g. involving obstacles or multiple rooms. In this article, we extend theoretical results from [Sprekeler and Wiskott, 2008], enabling PFAx to perform stable global navigation. So far, the most widely exploited characteristic of Slow Feature Analysis (SFA) was that slowness yields invariances. We focus on another fundamental characteristics of slow signals: They tend to yield monotonicity and one significant property of monotonicity is that local optimization is sufficient to find a global optimum. We present an SFA-based algorithm that structures an environment such that navigation tasks hierarchically decompose into subgoals. Each of these can be efficiently achieved by PFAx, yielding an overall global solution of the task. The algorithm needs to explore and process an environment only once and can then perform all sorts of navigation tasks efficiently. We support this algorithm by mathematical theory and apply it to different problems.
    BibTeX:
    			
                            @misc{RichthoferWiskott-2018,
                              author       = {Stefan Richthofer and Laurenz Wiskott},
                              title        = {{G}lobal {N}avigation {U}sing {P}redictable and {S}low {F}eature {A}nalysis in {M}ultiroom {E}nvironments, {P}ath {P}lanning and {O}ther {C}ontrol {T}asks},
                              journal      = {CoRR},
                              year         = {2018},
                              volume       = {abs/1805.08565},
                              howpublished = {e-print arXiv:1805.08565},
    			  
    			  url          = {https://arxiv.org/abs/1805.08565}
                            }
    			
    					
    Richthofer, S. & Wiskott, L. 2020 Singular Sturm-Liouville Problems with Zero Potential (q=0) and Singular Slow Feature Analysis e-print arXiv:2011.04765 .
     
    misc
    BibTeX:
    			
                            @misc{RichthoferWiskott-2020,
                              author       = {Richthofer, Stefan and Wiskott, Laurenz},
                              title        = {Singular Sturm-Liouville Problems with Zero Potential (q=0) and Singular Slow Feature Analysis},
                              year         = {2020},
                              howpublished = {e-print arXiv:2011.04765}
                            }
    			
    					
    Schiewer, R. & Wiskott, L. 2022 Modular Networks Prevent Catastrophic Interference in Model-Based Multi-task Reinforcement Learning Proc. 8th International Conference on Machine Learning, Optimization, and Data Science (LOD) , Lecture Notes in Computer Science , 13164 , 299–313 .
     
    inproceedings
    Abstract: In a multi-task reinforcement learning setting, the learner commonly benefits from training on multiple related tasks by exploiting similarities among them. At the same time, the trained agent is able to solve a wider range of different problems. While this effect is well documented for model-free multi-task methods, we demonstrate a detrimental effect when using a single learned dynamics model for multiple tasks. Thus, we address the fundamental question of whether model-based multi-task reinforcement learning benefits from shared dynamics models in a similar way model-free methods do from shared policy networks. Using a single dynamics model, we see clear evidence of task confusion and reduced performance. As a remedy, enforcing an internal structure for the learned dynamics model by training isolated sub-networks for each task notably improves performance while using the same amount of parameters. We illustrate our findings by comparing both methods on a simple gridworld and a more complex vizdoom multi-task experiment.
    BibTeX:
    			
                            @inproceedings{SchiewerWiskott-2022,
                              author       = {Schiewer, Robin and Wiskott, Laurenz},
                              title        = {Modular Networks Prevent Catastrophic Interference in Model-Based Multi-task Reinforcement Learning},
                              booktitle    = {Proc. 8th International Conference on Machine Learning, Optimization, and Data Science (LOD)},
                              publisher    = {Springer International Publishing},
                              year         = {2022},
                              volume       = {13164},
                              pages        = {299--313},
    			  
                              doi          = {http://doi.org/10.1007/978-3-030-95470-3_23}
                            }
    			
    					
    Schiewer, R. & Wiskott, L. 2021 Modular Networks Prevent Catastrophic Interference in Model-Based Multi-Task Reinforcement Learning e-print arXiv:2111.08010 .
     
    misc
    BibTeX:
    			
                            @misc{SchiewerWiskott-2021,
                              author       = {Robin Schiewer and Laurenz Wiskott},
                              title        = {Modular Networks Prevent Catastrophic Interference in Model-Based Multi-Task Reinforcement Learning},
                              year         = {2021},
                              howpublished = {e-print arXiv:2111.08010},
    			  
                              doi          = {http://doi.org/10.48550/arXiv.2111.08010}
                            }
    			
    					
    Schönfeld, F. 2015 A computational model of spatial encoding in the hippocampus PhD thesis, International Graduate School of Neuroscience, Ruhr-Universität Bochum .
     
    phdthesis
    BibTeX:
    			
                            @phdthesis{Schoenfeld-2015,
                              author       = {Fabian Schönfeld},
                              title        = {A computational model of spatial encoding in the hippocampus},
                              school       = {International Graduate School of Neuroscience, Ruhr-Universität Bochum},
                              year         = {2015},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/Schoenfeld-2015-PhDThesis.pdf}
                            }
    			
    					
    Schönfeld, F. & Wiskott, L. 2013 RatLab: an easy to use tool for place code simulations Frontiers in Computational Neuroscience , 7 , 104 .
     
    article SFA: Place cells II (2010-2015)
    Abstract: In this paper we present the RatLab toolkit, a software framework designed to set up and simulate a wide range of studies targeting the encoding of space in rats. It provides open access to our modeling approach to establish place and head direction cells within unknown environments and it offers a set of parameters to allow for the easy construction of a variety of enclosures for a virtual rat as well as controlling its movement pattern over the course of experiments. Once a spatial code is formed RatLab can be used to modify aspects of the enclosure or movement pattern and plot the effect of such modifications on the spatial representation, i.e., place and head direction cell activity. The simulation is based on a hierarchical Slow Feature Analysis (SFA) network topped by an independent component analysis (ICA) output layer. The SFA network generates its responses using raw visual input data only, which adds to its biological plausibility but requires experiments performed in light conditions. RatLab encapsulates such a network, generates the visual training data, and performs all sampling automatically—with each of these stages being further configurable by the user. RatLab was written with the intention to make our SFA model more accessible to the community and to that end features a range of elements to allow for experimentation with the model without the need for specific programming skills.
    BibTeX:
    			
                            @article{SchoenfeldWiskott-2013a,
                              author       = {F. Schönfeld and L. Wiskott},
                              title        = {Rat{L}ab: an easy to use tool for place code simulations},
                              journal      = {Frontiers in Computational Neuroscience},
                              year         = {2013},
                              volume       = {7},
                              pages        = {104},
    			  
    			  url          = {http://journal.frontiersin.org/article/10.3389/fncom.2013.00104/full},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/SchoenfeldWiskott-2013-FrontiersCompNeurosci-RatLab.pdf},
    			  
                              doi          = {http://doi.org/10.3389/fncom.2013.00104}
                            }
    			
    					
    Schönfeld, F. & Wiskott, L. 2013 Theoretical neuroscience: finding your way into the light IGSN report, International Graduate School of Neuroscience, Ruhr-Universität Bochum , 47-49 .
     
    techreport SFA: Place cells II (2010-2015)
    BibTeX:
    			
                            @techreport{SchoenfeldWiskott-2013b,
                              author       = {Fabian Schönfeld and Laurenz Wiskott},
                              title        = {Theoretical neuroscience: finding your way into the light},
                              journal      = {IGSN report},
                              school       = {International Graduate School of Neuroscience, Ruhr-Universität Bochum},
                              year         = {2013},
                              pages        = {47--49}
                            }
    			
    					
    Schönfeld, F. & Wiskott, L. 2015 Modeling place field activity with hierarchical Slow Feature Analysis Frontiers in Computational Neuroscience , 9 , 51 .
     
    article SFA: Place cells II (2010-2015)
    Abstract: What are the computational laws of hippocampal activity? In this paper we argue for the slowness principle as a fundamental processing paradigm behind hippocampal place cell firing. We present six different studies from the experimental literature, performed with real-life rats, that we replicated in computer simulations. Each of the chosen studies allows rodents to develop stable place fields and then examines a distinct property of the established spatial encoding: adaptation to cue relocation and removal; directional dependent firing in the linear track and open field; and morphing and scaling the environment itself. Simulations are based on a hierarchical Slow Feature Analysis (SFA) network topped by a principal component analysis (ICA) output layer. The slowness principle is shown to account for the main findings of the presented experimental studies. The SFA network generates its responses using raw visual input only, which adds to its biological plausibility but requires experiments performed in light conditions. Future iterations of the model will thus have to incorporate additional information, such as path integration and grid cell activity, in order to be able to also replicate studies that take place during darkness.
    BibTeX:
    			
                            @article{SchoenfeldWiskott-2015,
                              author       = {Fabian Schönfeld and Laurenz Wiskott},
                              title        = {Modeling place field activity with hierarchical {S}low {F}eature {A}nalysis},
                              journal      = {Frontiers in Computational Neuroscience},
                              year         = {2015},
                              volume       = {9},
                              pages        = {51},
    			  
    			  url          = {http://journal.frontiersin.org/article/10.3389/fncom.2015.00051/full},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/SchoenfeldWiskott-2015-FrontiersCompNeurosci-SFAPlaceFields.pdf},
    			  
                              doi          = {http://doi.org/10.3389/fncom.2015.00051}
                            }
    			
    					
    Schüler, M.; Hlynsson, H.D. & Wiskott, L. 2019 Gradient-based Training of Slow Feature Analysis by Differentiable Approximate Whitening Proceedings of The Eleventh Asian Conference on Machine Learning , Proceedings of Machine Learning Research , 101 , 316–331 .
     
    inproceedings
    BibTeX:
    			
                            @inproceedings{SchuelerHlynssonEtAl-2019,
                              author       = {Schüler, Merlin and Hlynsson, Hlynur Davíð and Wiskott, Laurenz},
                              title        = {Gradient-based Training of Slow Feature Analysis by Differentiable Approximate Whitening},
                              booktitle    = {Proceedings of The Eleventh Asian Conference on Machine Learning},
                              publisher    = {PMLR},
                              year         = {2019},
                              volume       = {101},
                              pages        = {316--331}
                            }
    			
    					
    Schüler, M.; Hlynsson, H.D. & Wiskott, L. 2018 Gradient-based Training of Slow Feature Analysis by Differentiable Approximate Whitening CoRR e-print arXiv:1808.08833 .
     
    misc
    BibTeX:
    			
                            @misc{SchuelerHlynssonEtAl-2018,
                              author       = {Schüler, Merlin and Hlynsson, Hlynur Davíð and Wiskott, Laurenz},
                              title        = {Gradient-based Training of Slow Feature Analysis by Differentiable Approximate Whitening},
                              journal      = {CoRR},
                              year         = {2018},
                              howpublished = {e-print arXiv:1808.08833},
    			  
    			  url          = {https://arxiv.org/abs/1808.08833}
                            }
    			
    					
    Seabrook, E. & Wiskott, L. 2022 A Tutorial on the Spectral Theory of Markov Chains e-print arXiv:2207.02296 .
     
    misc
    BibTeX:
    			
                            @misc{SeabrookWiskott-2022,
                              author       = {Eddie Seabrook and Laurenz Wiskott},
                              title        = {A Tutorial on the Spectral Theory of Markov Chains},
                              year         = {2022},
                              howpublished = {e-print arXiv:2207.02296},
    			  
                              doi          = {http://doi.org/10.48550/arXiv.2207.02296}
                            }
    			
    					
    Seabrook, E. & Wiskott, L. 2023 A Tutorial on the Spectral Theory of Markov Chains Neural Computation , 35(11) , 1713–1796 .
     
    article
    BibTeX:
    			
                            @article{SeabrookWiskott-2023,
                              author       = {Seabrook, Eddie and Wiskott, Laurenz},
                              title        = {A Tutorial on the Spectral Theory of Markov Chains},
                              journal      = {Neural Computation},
                              year         = {2023},
                              volume       = {35},
                              number       = {11},
                              pages        = {1713--1796},
    			  
                              doi          = {http://doi.org/10.1162/neco_a_01611}
                            }
    			
    					
    Sprekeler, H. 2009 Slowness learning PhD thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät I .
     
    phdthesis SFA and STDP (2003-2006), SFA: Place cells I (2003-2007), SFA: Theory of complex cells (2004-2007), Extended slow feature analysis (xSFA) (2006-2013)
    BibTeX:
    			
                            @phdthesis{Sprekeler-2009,
                              author       = {Henning Sprekeler},
                              title        = {Slowness learning},
                              school       = {Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät I},
                              year         = {2009},
    			  
    			  url          = {http://edoc.hu-berlin.de/docviews/abstract.php?id=29695},
    			  
                              doi          = {http://doi.org/10.18452/15897}
                            }
    			
    					
    Sprekeler, H.; Michaelis, C. & Wiskott, L. 2006 Slowness: an objective for spike-timing dependent plasticity? Proc. 2nd Bernstein Symposium for Computational Neuroscience, Oct 1-3, Berlin, Germany , 24 .
     
    inproceedings SFA and STDP (2003-2006)
    BibTeX:
    			
                            @inproceedings{SprekelerMichaelisEtAl-2006a,
                              author       = {Henning Sprekeler and Christian Michaelis and Laurenz Wiskott},
                              title        = {Slowness: an objective for spike-timing dependent plasticity?},
                              booktitle    = {Proc.\ 2nd Bernstein Symposium for Computational Neuroscience, Oct 1--3, Berlin, Germany},
                              publisher    = {Bernstein Center for Computational Neuroscience (BCCN) Berlin},
                              year         = {2006},
                              pages        = {24}
                            }
    			
    					
    Sprekeler, H.; Michaelis, C. & Wiskott, L. 2006 Slowness: an objective for spike-timing-dependent plasticity? Cognitive Sciences EPrint Archive (CogPrints) , 5281 .
     
    misc SFA and STDP (2003-2006)
    BibTeX:
    			
                            @misc{SprekelerMichaelisEtAl-2006b,
                              author       = {Henning Sprekeler and Christian Michaelis and Laurenz Wiskott},
                              title        = {Slowness: an objective for spike-timing-dependent plasticity?},
                              year         = {2006},
                              volume       = {5281},
                              howpublished = {Cognitive Sciences EPrint Archive (CogPrints)},
    			  
    			  url          = {http://cogprints.org/5281/}
                            }
    			
    					
    Sprekeler, H.; Michaelis, C. & Wiskott, L. 2007 Slowness: an objective for spike timing-dependent plasticity? Proc. 7th Göttingen Meeting of the German Neuroscience Society, Mar 29 - Apr 1, Göttingen, Germany , T27-3A .
     
    inproceedings SFA and STDP (2003-2006)
    BibTeX:
    			
                            @inproceedings{SprekelerMichaelisEtAl-2007a,
                              author       = {Henning Sprekeler and Christian Michaelis and Laurenz Wiskott},
                              title        = {Slowness: an objective for spike timing-dependent plasticity?},
                              booktitle    = {Proc.\ 7th Göttingen Meeting of the German Neuroscience Society, Mar 29 -- Apr 1, Göttingen, Germany},
                              year         = {2007},
                              pages        = {T27--3A}
                            }
    			
    					
    Sprekeler, H.; Michaelis, C. & Wiskott, L. 2007 Slowness: an objective for spike-timing-dependent plasticity? PLoS Computational Biology , 3(6) , e112 .
     
    article SFA and STDP (2003-2006)
    Abstract: Our nervous system can efficiently recognize objects in spite of changes in contextual variables such as perspective or lighting conditions. Several lines of research have proposed that this ability for invariant recognition is learned by exploiting the fact that object identities typically vary more slowly in time than contextual variables or noise. Here, we study the question of how this 'temporal stability' or 'slowness' approach can be implemented within the limits of biologically realistic spike-based learning rules. We first show that slow feature analysis, an algorithm that is based on slowness, can be implemented in linear continuous model neurons by means of a modified Hebbian learning rule. This approach provides a link to the trace rule, which is another implementation of slowness learning. Then, we show analytically that for linear Poisson neurons, slowness learning can be implemented by spike-timing-dependent plasticity (STDP) with a specific learning window. By studying the learning dynamics of STDP, we show that for functional interpretations of STDP, it is not the learning window alone that is relevant but rather the convolution of the learning window with the postsynaptic potential. We then derive STDP learning windows that implement slow feature analysis and the 'trace rule.' The resulting learning windows are compatible with physiological data both in shape and timescale. Moreover, our analysis shows that the learning window can be split into two functionally different components that are sensitive to reversible and irreversible aspects of the input statistics, respectively. The theory indicates that irreversible input statistics are not in favor of stable weight distributions but may generate oscillatory weight dynamics. Our analysis offers a novel interpretation for the functional role of STDP in physiological neurons.
    BibTeX:
    			
                            @article{SprekelerMichaelisEtAl-2007b,
                              author       = {Henning Sprekeler and Christian Michaelis and Laurenz Wiskott},
                              title        = {Slowness: an objective for spike-timing-dependent plasticity?},
                              journal      = {PLoS Computational Biology},
                              year         = {2007},
                              volume       = {3},
                              number       = {6},
                              pages        = {e112},
    			  
    			  url          = {http://dx.doi.org/10.1371/journal.pcbi.0030112},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/SprekelerMichaelisEtAl-2007b-PLoSCompBiol-SFA-STDP.pdf},
    			  
                              doi          = {http://doi.org/10.1371/journal.pcbi.0030112}
                            }
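The abstract's starting point, slow feature analysis in linear model neurons, can be illustrated with a short sketch (an illustrative reimplementation on a toy problem of our own, not the authors' code):

```python
import numpy as np

def linear_sfa(x):
    """Slowest projection directions of a multivariate time series x (T, n).

    Solves the generalized eigenproblem  A w = lambda B w  with
    A the covariance of the temporal differences and B the covariance
    of the (centered) signal; small lambda means a slow output.
    """
    x = x - x.mean(axis=0)
    xdot = np.diff(x, axis=0)
    A = xdot.T @ xdot / len(xdot)
    B = x.T @ x / len(x)
    d, U = np.linalg.eigh(B)
    Wh = U / np.sqrt(d)                # whitening matrix for B
    evals, V = np.linalg.eigh(Wh.T @ A @ Wh)
    return Wh @ V, evals               # columns sorted slow -> fast

# Toy demo: recover a slow sine mixed with a fast one.
t = np.linspace(0, 4 * np.pi, 2000)
sources = np.stack([np.sin(t), np.sin(20 * t)], axis=1)
mix = sources @ np.array([[0.6, 0.8], [-0.8, 0.6]])
W, evals = linear_sfa(mix)
slow = mix @ W[:, 0]                   # ~ the slow sine, up to sign and scale
```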
    			
    					
    Sprekeler, H. & Wiskott, L. 2006 Analytical derivation of complex cell properties from the slowness principle Proc. Berlin Neuroscience Forum, Jun 8-10, Bad Liebenwalde, Germany , 65-66 .
     
    inproceedings SFA: Theory of complex cells (2004-2007)
    BibTeX:
    			
                            @inproceedings{SprekelerWiskott-2006a,
                              author       = {H. Sprekeler and L. Wiskott},
                              title        = {Analytical derivation of complex cell properties from the slowness principle},
                              booktitle    = {Proc.\ Berlin Neuroscience Forum, Jun 8--10, Bad Liebenwalde, Germany},
                              publisher    = {Max-Delbrück-Centrum für Molekulare Medizin (MDC)},
                              year         = {2006},
                              pages        = {65--66}
                            }
    			
    					
    Sprekeler, H. & Wiskott, L. 2006 Analytical derivation of complex cell properties from the slowness principle Proc. 15th Annual Computational Neuroscience Meeting (CNS'06), Jul 16-20, Edinburgh, Scotland .
     
    inproceedings SFA: Theory of complex cells (2004-2007)
    BibTeX:
    			
                            @inproceedings{SprekelerWiskott-2006b,
                              author       = {Sprekeler, Henning and Wiskott, Laurenz},
                              title        = {Analytical derivation of complex cell properties from the slowness principle},
                              booktitle    = {Proc.\ 15th Annual Computational Neuroscience Meeting (CNS'06), Jul 16--20, Edinburgh, Scotland},
                              year         = {2006}
                            }
    			
    					
    Sprekeler, H. & Wiskott, L. 2006 Analytical derivation of complex cell properties from the slowness principle Proc. Conference on Mathematical Neuroscience (NEUROMATH 06), Sep 1-4, Andorra , 62 .
     
    inproceedings SFA: Theory of complex cells (2004-2007)
    BibTeX:
    			
                            @inproceedings{SprekelerWiskott-2006c,
                              author       = {Henning Sprekeler and Laurenz Wiskott},
                              title        = {Analytical derivation of complex cell properties from the slowness principle},
                              booktitle    = {Proc.\ Conference on Mathematical Neuroscience (NEUROMATH 06), Sep 1--4, Andorra},
                              year         = {2006},
                              pages        = {62}
                            }
    			
    					
    Sprekeler, H. & Wiskott, L. 2006 Analytical derivation of complex cell properties from the slowness principle Proc. 2nd Bernstein Symposium for Computational Neuroscience, Oct 1-3, Berlin, Germany , 67 .
     
    inproceedings SFA: Theory of complex cells (2004-2007)
    BibTeX:
    			
                            @inproceedings{SprekelerWiskott-2006d,
                              author       = {Henning Sprekeler and Laurenz Wiskott},
                              title        = {Analytical derivation of complex cell properties from the slowness principle},
                              booktitle    = {Proc.\ 2nd Bernstein Symposium for Computational Neuroscience, Oct 1--3, Berlin, Germany},
                              publisher    = {Bernstein Center for Computational Neuroscience (BCCN) Berlin},
                              year         = {2006},
                              pages        = {67}
                            }
    			
    					
    Sprekeler, H. & Wiskott, L. 2007 Spike-timing-dependent plasticity and temporal input statistics Proc. 16th Annual Computational Neuroscience Meeting (CNS'07), Jul 7-12, Toronto, Canada .
    (Special issue of BMC Neuroscience 8(Suppl 2):P86)  
    inproceedings SFA and STDP (2003-2006)
    BibTeX:
    			
                            @inproceedings{SprekelerWiskott-2007,
                              author       = {H. Sprekeler and L. Wiskott},
                              title        = {Spike-timing-dependent plasticity and temporal input statistics},
                              booktitle    = {Proc.\ 16th Annual Computational Neuroscience Meeting (CNS'07), Jul 7--12, Toronto, Canada},
                              year         = {2007},
    			  
    			  url          = {http://www.biomedcentral.com/1471-2202/8/S2/P86},
    			  
                              doi          = {http://doi.org/10.1186/1471-2202-8-s2-p86}
                            }
    			
    					
    Sprekeler, H. & Wiskott, L. 2008 Understanding Slow Feature Analysis: a mathematical framework Cognitive Sciences EPrint Archive (CogPrints) , 6223 .
     
    misc Extended slow feature analysis (xSFA) (2006-2013)
    Abstract: Slow feature analysis is an algorithm for unsupervised learning of invariant representations from data with temporal correlations. Here, we present a mathematical analysis of slow feature analysis for the case where the input-output functions are not restricted in complexity. We show that the optimal functions obey a partial differential eigenvalue problem of a type that is common in theoretical physics. This analogy allows the transfer of mathematical techniques and intuitions from physics to concrete applications of slow feature analysis, thereby providing the means for analytical predictions and a better understanding of simulation results. We put particular emphasis on the situation where the input data are generated from a set of statistically independent sources. The dependence of the optimal functions on the sources is calculated analytically for the cases where the sources have Gaussian or uniform distribution.
    BibTeX:
    			
                            @misc{SprekelerWiskott-2008,
                              author       = {Henning Sprekeler and Laurenz Wiskott},
                              title        = {Understanding {S}low {F}eature {A}nalysis: a mathematical framework},
                              year         = {2008},
                              volume       = {6223},
                              howpublished = {Cognitive Sciences EPrint Archive (CogPrints)},
    			  
    			  url          = {http://cogprints.org/6223/}
                            }
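In standard SFA notation (assumed here, since the entry itself gives no formulas), the optimization problem whose unrestricted optima the abstract characterizes reads:

```latex
% SFA: find functions g_j such that the outputs y_j(t) = g_j(x(t)) vary slowly.
\min_{g_j}\; \Delta(y_j) = \big\langle \dot{y}_j^{\,2} \big\rangle_t
\qquad \text{subject to} \qquad
\langle y_j \rangle_t = 0, \quad
\langle y_j^2 \rangle_t = 1, \quad
\langle y_i y_j \rangle_t = 0 \;\; (i < j).
% With unrestricted function complexity, the abstract states that the optimal
% g_j obey a partial differential eigenvalue problem, schematically
% \mathcal{D} g_j = \lambda_j g_j, with the slowness values as eigenvalues.
```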
    			
    					
    Sprekeler, H. & Wiskott, L. 2011 A theory of Slow Feature Analysis for transformation-based input signals with an application to complex cells Neural Computation , 23(2) , 303-335 .
     
    article SFA: Theory of complex cells (2004-2007)
    Abstract: We develop a group theoretical analysis of slow feature analysis for the case where the input data are generated by applying a set of continuous transformations to static templates. As an application of the theory, we analytically derive nonlinear visual receptive fields and show that their optimal stimuli as well as the orientation and frequency tuning are in good agreement with previous simulations of complex cells in primary visual cortex (Berkes and Wiskott, 2005). The theory suggests that side- and end-stopping can be interpreted as a weak breaking of translation invariance. Direction selectivity is also discussed.
    BibTeX:
    			
                            @article{SprekelerWiskott-2011,
                              author       = {Henning Sprekeler and Laurenz Wiskott},
                              title        = {A theory of {S}low {F}eature {A}nalysis for transformation-based input signals with an application to complex cells},
                              journal      = {Neural Computation},
                              year         = {2011},
                              volume       = {23},
                              number       = {2},
                              pages        = {303--335},
    			  
    			  url          = {http://www.mitpressjournals.org/doi/abs/10.1162/NECO_a_00072},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/SprekelerWiskott-2011-NeurComp-SFATheoryRFs.pdf},
    			  
                              doi          = {http://doi.org/10.1162/NECO_a_00072}
                            }
    			
    					
    Sprekeler, H.; Zito, T. & Wiskott, L. 2010 An extension of Slow Feature Analysis for nonlinear blind source separation Cognitive Sciences EPrint Archive (CogPrints) , 7056 .
     
    misc Extended slow feature analysis (xSFA) (2006-2013)
    BibTeX:
    			
                            @misc{SprekelerZitoEtAl-2010,
                              author       = {Henning Sprekeler and Tiziano Zito and Laurenz Wiskott},
                              title        = {An extension of {S}low {F}eature {A}nalysis for nonlinear blind source separation},
                              year         = {2010},
                              volume       = {7056},
                              howpublished = {Cognitive Sciences EPrint Archive (CogPrints)},
    			  
    			  url          = {http://cogprints.org/7056/},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/SprekelerZitoEtAl-2010-CogPrints-xSFA.pdf}
                            }
    			
    					
    Sprekeler, H.; Zito, T. & Wiskott, L. 2014 An extension of Slow Feature Analysis for nonlinear blind source separation Journal of Machine Learning Research , 15 , 921-947 .
     
    article Extended slow feature analysis (xSFA) (2006-2013)
    Abstract: We present and test an extension of slow feature analysis as a novel approach to nonlinear blind source separation. The algorithm relies on temporal correlations and iteratively reconstructs a set of statistically independent sources from arbitrary nonlinear instantaneous mixtures. Simulations show that it is able to invert a complicated nonlinear mixture of two audio signals with a high reliability. The algorithm is based on a mathematical analysis of slow feature analysis for the case of input data that are generated from statistically independent sources.
    BibTeX:
    			
                            @article{SprekelerZitoEtAl-2014,
                              author       = {Henning Sprekeler and Tiziano Zito and Laurenz Wiskott},
                              title        = {An extension of {S}low {F}eature {A}nalysis for nonlinear blind source separation},
                              journal      = {Journal of Machine Learning Research},
                              year         = {2014},
                              volume       = {15},
                              pages        = {921--947},
    			  
    			  url          = {http://jmlr.org/papers/v15/sprekeler14a.html},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/SprekelerZitoEtAl-2014-JMLR-xSFA.pdf}
                            }
    			
    					
    Walther, T.; Diekmann, N.; Vijayabaskaran, S.; Donoso, J.R.; Manahan-Vaughan, D.; Wiskott, L. & Cheng, S. 2021 Context-dependent extinction learning emerging from raw sensory inputs: a reinforcement learning approach Scientific Reports , 11(1) .
     
    article
    BibTeX:
    			
                            @article{WaltherDiekmannVijayabaskaranEtAl2021,
                              author       = {Walther, Thomas and Diekmann, Nicolas and Vijayabaskaran, Sandhiya and Donoso, José R. and Manahan-Vaughan, Denise and Wiskott, Laurenz and Cheng, Sen},
                              title        = {Context-dependent extinction learning emerging from raw sensory inputs: a reinforcement learning approach},
                              journal      = {Scientific Reports},
                              year         = {2021},
                              volume       = {11},
                              number       = {1},
    			  
                              doi          = {http://doi.org/10.1038/s41598-021-81157-z}
                            }
    			
    					
    Wang, N. 2014 Learning natural image statistics with variants of Restricted Boltzmann Machines International Graduate School of Neuroscience, Ruhr-Universität Bochum , International Graduate School of Neuroscience, Ruhr-Universität Bochum .
     
    phdthesis RBM: Modeling natural images (2012-2014), RBM: Spontaneous correlations (2013-2014)
    BibTeX:
    			
                            @phdthesis{Wang-2014,
                              author       = {Nan Wang},
                              title        = {Learning natural image statistics with variants of {R}estricted {B}oltzmann {M}achines},
                              school       = {International Graduate School of Neuroscience, Ruhr-Universität Bochum},
                              year         = {2014},
    			  
    			  url          = {http://hss-opus.ub.ruhr-uni-bochum.de/opus4/frontdoor/index/index/docId/4619}
                            }
    			
    					
    Wang, N.; Jancke, D. & Wiskott, L. 2013 Modeling correlations in spontaneous activity of visual cortex with centered Gaussian-binary Deep Boltzmann Machines CoRR e-print arXiv:1312.6108 .
    (latest version 2014-02-17 v3)  
    misc RBM: Spontaneous correlations (2013-2014)
    Abstract: Spontaneous cortical activity -- the ongoing cortical activities in absence of intentional sensory input -- is considered to play a vital role in many aspects of both normal brain functions and mental dysfunctions. We present a centered Gaussian-binary Deep Boltzmann Machine (GDBM) for modeling the activity in early cortical visual areas and relate the random sampling in GDBMs to the spontaneous cortical activity. After training the proposed model on natural image patches, we show that the samples collected from the model's probability distribution encompass similar activity patterns as found in the spontaneous activity. Specifically, filters having the same orientation preference tend to be active together during random sampling. Our work demonstrates the centered GDBM is a meaningful model approach for basic receptive field properties and the emergence of spontaneous activity patterns in early cortical visual areas. Besides, we show empirically that centered GDBMs do not suffer from the difficulties during training as GDBMs do and can be properly trained without the layer-wise pretraining.
    BibTeX:
    			
                            @misc{WangJanckeEtAl-2013,
                              author       = {Nan Wang and Dirk Jancke and Laurenz Wiskott},
                              title        = {Modeling correlations in spontaneous activity of visual cortex with centered {G}aussian-binary {D}eep {B}oltzmann {M}achines},
                              journal      = {CoRR},
                              year         = {2013},
                              howpublished = {e-print arXiv:1312.6108},
    			  
    			  url          = {https://arxiv.org/abs/1312.6108},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/WangJanckeEtAl-2013-arXiveV3-SpontaneousCorr.pdf}
                            }
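To make the sampling idea concrete, the following is a minimal Gaussian-binary RBM Gibbs sampler. This is a much-simplified, untrained stand-in for the centered deep model of the paper; all sizes and parameter values are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
n_vis, n_hid = 4, 3
W = rng.normal(0.0, 0.1, size=(n_vis, n_hid))  # weights (untrained placeholder)
b = np.zeros(n_vis)                            # visible biases
c = np.zeros(n_hid)                            # hidden biases
sigma = 1.0                                    # visible noise std

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_h(v):
    # Binary hiddens: p(h_j = 1 | v) = sigm(c_j + (v / sigma^2) @ W_j)
    p = sigmoid(c + (v / sigma**2) @ W)
    return (rng.random(n_hid) < p).astype(float)

def sample_v(h):
    # Gaussian visibles: v | h ~ N(b + W @ h, sigma^2 I)
    return b + W @ h + sigma * rng.normal(size=n_vis)

# Block Gibbs sampling -- loosely the 'random sampling' that the abstract
# relates to spontaneous cortical activity.
v = rng.normal(size=n_vis)
for _ in range(100):
    h = sample_h(v)
    v = sample_v(h)
```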
    			
    					
    Wang, N.; Jancke, D. & Wiskott, L. 2014 Modeling correlations in spontaneous activity of visual cortex with centered Gaussian-binary Deep Boltzmann Machines Proc. International Conference of Learning Representations (ICLR'14, workshop), Apr 14-16, Banff, Alberta, Canada .
     
    inproceedings RBM: Spontaneous correlations (2013-2014)
    BibTeX:
    			
                            @inproceedings{WangJanckeEtAl-2014a,
                              author       = {Nan Wang and Dirk Jancke and Laurenz Wiskott},
                              title        = {Modeling correlations in spontaneous activity of visual cortex with centered {G}aussian-binary {D}eep {B}oltzmann {M}achines},
                              booktitle    = {Proc.\ International Conference of Learning Representations (ICLR'14, workshop), Apr 14--16, Banff, Alberta, Canada},
                              year         = {2014},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/WangJanckeEtAl-2014a-ProcICLR-SpontaneousCorr-outdatedVersion-seeArXivForUpdate.pdf}
                            }
    			
    					
    Wang, N.; Jancke, D. & Wiskott, L. 2014 Modeling correlations in spontaneous activity of visual cortex with Gaussian-binary Deep Boltzmann Machines Proc. Bernstein Conference for Computational Neuroscience, Sep 3-5, Göttingen, Germany , 263-264 .
     
    inproceedings RBM: Spontaneous correlations (2013-2014)
    BibTeX:
    			
                            @inproceedings{WangJanckeEtAl-2014b,
                              author       = {Nan Wang and Dirk Jancke and Laurenz Wiskott},
                              title        = {Modeling correlations in spontaneous activity of visual cortex with {G}aussian-binary {D}eep {B}oltzmann {M}achines},
                              booktitle    = {Proc.\ Bernstein Conference for Computational Neuroscience, Sep 3--5, Göttingen, Germany},
                              publisher    = {BFNT Göttingen},
                              year         = {2014},
                              pages        = {263--264}
                            }
    			
    					
    Wang, N.; Melchior, J. & Wiskott, L. 2012 An analysis of Gaussian-binary Restricted Boltzmann Machines for natural images Proc. 20th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, Apr 25-27, Bruges, Belgium , 287-292 .
     
    inproceedings RBM: Modeling natural images (2012-2014)
    BibTeX:
    			
                            @inproceedings{WangMelchiorEtAl-2012a,
                              author       = {Nan Wang and Jan Melchior and Laurenz Wiskott},
                              title        = {An analysis of {G}aussian-binary {R}estricted {B}oltzmann {M}achines for natural images},
                              booktitle    = {Proc.\ 20th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, Apr 25--27, Bruges, Belgium},
                              year         = {2012},
                              pages        = {287--292},
    			  
    			  url          = {https://pdfs.semanticscholar.org/f3f7/34d9cc49c8ee8573c7712ae562f79c795dc4.pdf},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/WangMelchiorEtAl-2012a-ProcESANN-RBMImages.pdf}
                            }
    			
    					
    Wang, N.; Melchior, J. & Wiskott, L. 2014 Gaussian-binary Restricted Boltzmann Machines on modeling natural image statistics CoRR e-print arXiv:1401.5900 .
     
    misc RBM: Modeling natural images (2012-2014)
    BibTeX:
    			
                            @misc{WangMelchiorEtAl-2014,
                              author       = {Nan Wang and Jan Melchior and Laurenz Wiskott},
                              title        = {{G}aussian-binary {R}estricted {B}oltzmann {M}achines on modeling natural image statistics},
                              journal      = {CoRR},
                              year         = {2014},
                              howpublished = {e-print arXiv:1401.5900},
    			  
    			  url          = {https://arxiv.org/abs/1401.5900}
                            }
    			
    					
    Weghenkel, B. 2019 Unsupervised extraction of predictable features from high-dimensional time series Ruhr-Universität Bochum , Ruhr-Universität Bochum .
     
    phdthesis
    Abstract: For every agent (biological or artificial) that pursues a goal in an interactive environment, it is of central importance to orient its information processing towards those aspects of the environment that are predictive in the sense of containing information about the agent's future. Starting from this premise, this dissertation explores two ideas: firstly, to use predictability objectives for the development of new unsupervised learning algorithms; and secondly, to explore whether predictability can serve as a fundamental principle in the modeling of neural information processing, as has been demonstrated before for principles like slowness and sparseness. With respect to the first question, two new unsupervised learning algorithms are proposed. The first, graph-based predictable feature analysis (GPFA), adopts the paradigm of slow feature analysis (SFA) in that it takes high-dimensional time series as input and extracts the most predictable subspaces or (non-linear) predictable features from them. The second algorithm discretizes a continuous-valued environment or state space (as found in reinforcement learning settings) such that the resulting states are predictive of future states. An experimental investigation further explores the behavior of SFA and three comparable approaches to predictable feature learning on a diverse collection of real-world datasets. It concludes that, even though SFA was not designed as such, in practice it serves as a surprisingly effective approach to learning predictable features according to three different measures, and, accordingly, that previous results on SFA as a model of neural information processing can already be understood as implementations of a predictability objective.

    BibTeX:
    			
                            @phdthesis{Weghenkel-2019,
                              author       = {Bj{\"o}rn Weghenkel},
                              title        = {Unsupervised extraction of predictable features from high-dimensional time series},
                              school       = {Ruhr-Universit{\"a}t Bochum},
                              year         = {2019},
    			  
    			  url          = {https://doi.org/10.13154/294-6819},
    			  
                              doi          = {http://doi.org/10.13154/294-6819}
                            }
    			
    					
    Weghenkel, B.; Fischer, A. & Wiskott, L. 2016 Graph-based Predictable Feature Analysis CoRR e-print arXiv:1602.00554v1 .
    (Preprint)  
    misc Predictable Feature Analysis (2010-now)
    BibTeX:
    			
                            @misc{WeghenkelFischerEtAl-2016,
                              author       = {Björn Weghenkel and Asja Fischer and Laurenz Wiskott},
                              title        = {Graph-based {P}redictable {F}eature {A}nalysis},
                              journal      = {CoRR},
                              year         = {2016},
                              howpublished = {e-print arXiv:1602.00554v1},
    			  
    			  url          = {https://arxiv.org/abs/1602.00554v1}
                            }
    			
    					
    Weghenkel, B.; Fischer, A. & Wiskott, L. 2017 Graph-based Predictable Feature Analysis CoRR e-print arXiv:1602.00554v2 .
    (Preprint)  
    misc Predictable Feature Analysis (2010-now)
    BibTeX:
    			
                            @misc{WeghenkelFischerEtAl-2017a,
                              author       = {Björn Weghenkel and Asja Fischer and Laurenz Wiskott},
                              title        = {Graph-based {P}redictable {F}eature {A}nalysis},
                              journal      = {CoRR},
                              year         = {2017},
                              howpublished = {e-print arXiv:1602.00554v2},
    			  
    			  url          = {https://arxiv.org/abs/1602.00554v2}
                            }
    			
    					
    Weghenkel, B.; Fischer, A. & Wiskott, L. 2017 Graph-based predictable feature analysis Machine Learning The European Conference on Machine Learning (ECML) & Principles and Practice of Knowledge Discovery in Databases (PKDD) 2017, Sep 18-22, Skopje, Macedonia , 106(9) , 1359-1380 .
    (Special issue of Machine Learning, 106(9):1359--1380)  
    article Predictable Feature Analysis (2010-now)
    Abstract: We propose graph-based predictable feature analysis (GPFA), a new method for unsupervised learning of predictable features from high-dimensional time series, where high predictability is understood very generically as low variance in the distribution of the next data point given the previous ones. We show how this measure of predictability can be understood in terms of graph embedding as well as how it relates to the information-theoretic measure of predictive information in special cases. We confirm the effectiveness of GPFA on different datasets, comparing it to three existing algorithms with similar objectives---namely slow feature analysis, forecastable component analysis, and predictable feature analysis---to which GPFA shows very competitive results.
    BibTeX:
    			
                            @article{WeghenkelFischerEtAl-2017b,
                              author       = {Weghenkel, Bj{\"o}rn and Fischer, Asja and Wiskott, Laurenz},
                              title        = {Graph-based predictable feature analysis},
                              booktitle    = {The European Conference on Machine Learning (ECML) \& Principles and Practice of Knowledge Discovery in Databases (PKDD) 2017, Sep 18--22, Skopje, Macedonia},
                              journal      = {Machine Learning},
                              year         = {2017},
                              volume       = {106},
                              number       = {9},
                              pages        = {1359--1380},
    			  
    			  url          = {http://rdcu.be/vDA},
    			  
                              doi          = {http://doi.org/10.1007/s10994-017-5632-x}
                            }
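The predictability notion in the abstract (low variance of the next data point given the previous ones) can be illustrated with a deliberately simplified score: the residual variance of a linear one-step predictor for a 1-D feature. This sketches the objective only, not the GPFA algorithm itself.

```python
import numpy as np

def predictability_score(y):
    """Residual variance of an AR(1) predictor; lower = more predictable."""
    y = (y - y.mean()) / y.std()
    y0, y1 = y[:-1], y[1:]
    a = (y0 @ y1) / (y0 @ y0)     # least-squares one-step coefficient
    return (y1 - a * y0).var()

rng = np.random.default_rng(1)
t = np.linspace(0, 8 * np.pi, 3000)
smooth = np.sin(t)                 # strongly predictable feature
noise = rng.normal(size=t.size)    # unpredictable feature
# predictability_score(smooth) is near 0; predictability_score(noise) near 1.
```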
    			
    					
    Weghenkel, B. & Wiskott, L. 2014 Learning predictive partitions for continuous feature spaces Proc. 22nd European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN), Apr 23-25, Bruges, Belgium , 577-582 .
     
    inproceedings Predictable Feature Analysis (2010-now)
    Abstract: Any non-trivial agent (biological or algorithmic) that interacts with its environment needs some representation of its current state. Such a state should enable it to make informed decisions that lead to some desired outcome in the future. In practice, many learning algorithms assume states to come from a discrete set while real-world learning problems often are continuous in nature. We propose an unsupervised learning algorithm that finds discrete partitions of a continuous feature space that are predictive with respect to the future. More precisely, the learned partitions induce a Markov chain on the data with high mutual information between the current state and the next state. Such predictive partitions can serve as an alternative to classical discretization algorithms in cases where the predictable time-structure of the data is of importance.
    BibTeX:
    			
                            @inproceedings{WeghenkelWiskott-2014,
                              author       = {Bj{\"o}rn Weghenkel and Laurenz Wiskott},
                              title        = {Learning predictive partitions for continuous feature spaces},
                              booktitle    = {Proc.\ 22nd European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN), Apr 23--25, Bruges, Belgium},
                              year         = {2014},
                              pages        = {577--582},
    			  
    			  url          = {https://pdfs.semanticscholar.org/b0a0/52aedccc0e421ea2cdbfca9c4c4625cb4bde.pdf},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/WeghenkelWiskott-2014-ESANN-Preprint.pdf}
                            }
    			
    					
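The partition-learning abstract above scores a discretization by the mutual information between the current state and the next state of the induced Markov chain. A toy sketch of that scoring criterion (hypothetical, not the paper's optimization procedure; `partition_mutual_information` is an assumed name):

```python
import math
import random
from collections import Counter

# Hypothetical sketch (not the paper's algorithm): score a partition of a
# continuous time series by the mutual information between the current
# discrete state and the next one, estimated from transition counts.
def partition_mutual_information(x, edges):
    s = [sum(v > e for e in edges) for v in x]   # discrete state sequence
    pairs = Counter(zip(s[:-1], s[1:]))
    n = len(s) - 1
    px = Counter(a for a, _ in pairs.elements())
    py = Counter(b for _, b in pairs.elements())
    return sum(
        (c / n) * math.log((c / n) / ((px[a] / n) * (py[b] / n)))
        for (a, b), c in pairs.items()
    )

rng = random.Random(1)
walk, acc = [], 0.0
for _ in range(5000):
    acc += rng.gauss(0, 1)             # slowly moving random walk
    walk.append(acc)
q = sorted(walk)
med = q[len(q) // 2]
quartiles = [q[len(q) // 4], med, q[3 * len(q) // 4]]
# A finer partition of a slowly moving signal retains more of its
# predictive time-structure (higher mutual information).
print(partition_mutual_information(walk, quartiles)
      > partition_mutual_information(walk, [med]))
```

The paper searches over such partitions; this sketch only shows how a candidate partition would be evaluated.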
    Weghenkel, B. & Wiskott, L. 2018 Slowness as a Proxy for Temporal Predictability: An Empirical Comparison Neural Computation , 30(5) , 1151-1179 .
     
    article
    Abstract: The computational principles of slowness and predictability have been proposed to describe aspects of information processing in the visual system. From the perspective of slowness being a limited special case of predictability, we investigate the relationship between these two principles empirically. On a collection of real-world data sets we compare the features extracted by slow feature analysis (SFA) to the features of three recently proposed methods for predictable feature extraction: forecastable component analysis, predictable feature analysis, and graph-based predictable feature analysis. Our experiments show that the predictability of the learned features is highly correlated, and, thus, SFA appears to effectively implement a method for extracting predictable features according to different measures of predictability.
    BibTeX:
    			
                            @article{WeghenkelWiskott-2018,
                              author       = {Weghenkel, Bj{\"o}rn and Wiskott, Laurenz},
                              title        = {Slowness as a Proxy for Temporal Predictability: An Empirical Comparison},
                              journal      = {Neural Computation},
                              publisher    = {MIT Press},
                              year         = {2018},
                              volume       = {30},
                              number       = {5},
                              pages        = {1151--1179},
    			  
                              doi          = {http://doi.org/10.1162/neco_a_01070}
                            }
    			
    					
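The comparison above ranks features by slowness. The standard SFA slowness measure is the mean squared temporal difference of a unit-variance signal; a minimal sketch (the helper name `delta_value` is assumed for illustration):

```python
import math
import statistics

# Hypothetical sketch: the slowness (Delta value) of a feature, i.e. the
# mean of its squared temporal differences after normalizing to unit
# variance, as used to rank SFA output features.
def delta_value(y):
    mu = statistics.mean(y)
    sd = statistics.pstdev(y)
    z = [(v - mu) / sd for v in y]
    return statistics.mean((b - a) ** 2 for a, b in zip(z[:-1], z[1:]))

t = [2 * math.pi * i / 1000 for i in range(1000)]
slow = [math.sin(v) for v in t]        # slowly varying feature
fast = [math.sin(40 * v) for v in t]   # quickly varying feature
# The slower signal has the smaller Delta value.
print(delta_value(slow) < delta_value(fast))
```

The paper's empirical finding is that features with small Delta values also tend to score well under explicit measures of predictability.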
    Wicklein, M.; Strausfeld, N.J.; Sejnowski, T.; Sabes, P. & Wiskott, L. 1998 Looming sensitivity in hummingbird hawkmoths: neurons and models Proc. Society for Neuroscience 28th Annual Meeting , 24 , 188 .
     
    inproceedings N.N.
    Abstract: Intracellular recordings in Manduca sexta (Sphingidae, Lepidoptera) identified a class of wide-field neuron that responded selectively to looming or receding stimuli. Clockwise and counter-clockwise rotating spirals and expanding or contracting discs simulated looming and anti-looming. Both spirals and discs provide the eye with outwardly or inwardly moving edges, but the spiral simulates looming or anti-looming while maintaining a constant area, perimeter length, and luminance on the retina. Type 1 cells are activated only by the disc and not the spiral, effectively distinguishing expansion from contraction by measuring perimeter length. The cell class is further divided: type 1a neurons responded to looming and were inhibited by image size decrease (anti-looming), whereas type 1b neurons were activated by anti-looming and inhibited by looming. The proposed model for the type 1 neurons requires them to be mutually inhibited, while being fed by two systems of retinotopically organized, direction-insensitive and motion-sensitive edge detectors through an intermediate level of elements that either preserve or invert the signal. The first of the two systems provides an excitatory input to the type 1a neuron; the second inhibits the type 1b neuron. As the edge of the looming stimulus expands on the retina, there is a recruitment of sequentially stimulated edge detectors, increasing the depolarization of the class 1a cell and increasing the inhibition of the class 1b cell. The opposite occurs with the reversed stimulus: excitation of the class 1a cell diminishes, while the class 1b cell is gradually released from inhibition. Reciprocal inhibition between the class 1a and class 1b neurons provides the observed responses of excitation to looming and inhibition to anti-looming, or the reverse. We implemented the proposed model, comparing the performance of the real circuit with the model. The model proves able to simulate the essential features of the neuronal circuit.
    BibTeX:
    			
                            @inproceedings{WickleinStrausfeldEtAl-1998,
                              author       = {Martina Wicklein and N. J. Strausfeld and Terrence Sejnowski and P. Sabes and Laurenz Wiskott},
                              title        = {Looming sensitivity in hummingbird hawkmoths: neurons and models},
                              booktitle    = {Proc.\ Society for Neuroscience 28th Annual Meeting},
                              year         = {1998},
                              volume       = {24},
                              pages        = {188}
                            }
    			
    					
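The abstract above describes type 1a and 1b neurons as mutually inhibiting units driven by opposite edge-motion systems. A toy rate-model sketch of such reciprocal inhibition (an assumed illustration, not the authors' implemented circuit; parameters are arbitrary):

```python
# Toy rate-model sketch (assumed, not the authors' implementation) of two
# mutually inhibiting units: 1a driven by looming (expanding edges),
# 1b driven by anti-looming (contracting edges).
def step(r1a, r1b, edge_drive, w_inh=0.5, dt=0.1):
    # edge_drive > 0 models looming input, edge_drive < 0 anti-looming
    d1a = -r1a + max(0.0, edge_drive) - w_inh * r1b
    d1b = -r1b + max(0.0, -edge_drive) - w_inh * r1a
    # rectified Euler update: firing rates cannot go negative
    return max(0.0, r1a + dt * d1a), max(0.0, r1b + dt * d1b)

r1a = r1b = 0.0
for _ in range(200):                   # sustained looming stimulus
    r1a, r1b = step(r1a, r1b, edge_drive=1.0)
# Unit 1a is excited while unit 1b is held silent by inhibition.
print(r1a > 0.5 and r1b == 0.0)
```

Reversing the sign of `edge_drive` swaps the roles of the two units, matching the excitation/inhibition symmetry described in the abstract.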
    Wilbert, N. 2012 Hierarchical Slow Feature Analysis on visual stimuli and top-down reconstruction PhD thesis, Institute for Biology , Humboldt University Berlin, Germany .
     
    phdthesis MDP: Modular toolkit for data processing (2003-now), SFA: Learning visual invariances II (2006-2009), SFA and RL on visual input (2008,2009), N.N.
    Abstract: This dissertation investigates a model of the visual system based on the principle of unsupervised slowness learning and the SFA algorithm (Slow Feature Analysis). The model is applied here to invariant object recognition and related problems. It can extract both the underlying discrete variables of the stimuli (e.g., the identity of the shown object) and continuous variables (e.g., position and rotation angle), and it is able to cope with complicated transformations such as rotation in depth. The performance of the model is first analyzed with supervised methods of data analysis. It is then shown that the biologically grounded method of reinforcement learning can also successfully use the output of our model, which allows reinforcement learning to be applied to high-dimensional visual stimuli. In the second part of the thesis, the hierarchical model is extended with top-down processes, specifically for the reconstruction of visual stimuli. For this we employ vector quantization combined with a gradient-descent procedure. The essential components of the software developed for our simulations were integrated into an open-source library, the ``Modular toolkit for Data Processing'' (MDP). These software components are presented in the last part of the dissertation.
    BibTeX:
    			
                            @phdthesis{Wilbert-2012,
                              author       = {Niko Wilbert},
                              title        = {Hierarchical {S}low {F}eature {A}nalysis on visual stimuli and top-down reconstruction},
                              school       = {Institute for Biology},
                              year         = {2012},
    			  
    			  url          = {http://edoc.hu-berlin.de/docviews/abstract.php?id=39426},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/Wilbert-2012-PhDThesis.pdf},
    			  
                              doi          = {http://doi.org/10.18452/16526}
                            }
    			
    					
    Wilbert, N.; Franzius, M.; Cichy, R.; Schmidt, S.; Brandt, S. & Wiskott, L. 2007 Towards a model of visual attention Proc. Midterm Evaluation of the German National Network for Computational Neuroscience, Dec 3-4, Berlin, Germany , 30 .
     
    inproceedings SFA: Learning visual invariances II (2006-2009)
    BibTeX:
    			
                            @inproceedings{WilbertFranziusEtAl-2007,
                              author       = {Niko Wilbert and Mathias Franzius and Radoslaw Cichy and Sein Schmidt and Stephan Brandt and Laurenz Wiskott},
                              title        = {Towards a model of visual attention},
                              booktitle    = {Proc.\ Midterm Evaluation of the German National Network for Computational Neuroscience, Dec 3--4, Berlin, Germany},
                              year         = {2007},
                              pages        = {30}
                            }
    			
    					
    Wilbert, N.; Legenstein, R.; Franzius, M. & Wiskott, L. 2009 Reinforcement learning on complex visual stimuli Proc. 18th Annual Computational Neuroscience Meeting (CNS'09), Jul 18-23, Berlin, Germany .
    (Special issue of BMC Neuroscience 10(Suppl 1):P90)  
    inproceedings SFA and RL on visual input (2008,2009)
    BibTeX:
    			
                            @inproceedings{WilbertLegensteinEtAl-2009,
                              author       = {Niko Wilbert and Robert Legenstein and Mathias Franzius and Laurenz Wiskott},
                              title        = {Reinforcement learning on complex visual stimuli},
                              booktitle    = {Proc.\ 18th Annual Computational Neuroscience Meeting (CNS'09), Jul 18--23, Berlin, Germany},
                              year         = {2009},
    			  
    			  url          = {http://www.biomedcentral.com/1471-2202/10/S1/P90},
    			  
                              doi          = {http://doi.org/10.1186/1471-2202-10-S1-P90}
                            }
    			
    					
    Wilbert, N. & Wiskott, L. 2010 Hierarchical Slow Feature Analysis and top-down processes Proc. Bernstein Conference on Computational Neuroscience, Sep 27-Oct 1, Berlin, Germany .
    (Event abstract in Frontiers in Computational Neuroscience)  
    inproceedings N.N.
    Abstract: Top-down processes are thought to play an important role in the mammalian visual system, e.g., for interpreting ambiguous stimuli. Slow Feature Analysis (SFA) [1], on the other hand, has proven to be an efficient algorithm for the bottom-up processing of visual stimuli [2][3]. Therefore it seems natural to combine bottom-up SFA with top-down processes. SFA is an unsupervised learning algorithm that leverages the time structure of incoming stimuli to extract higher-level features. The SFA algorithm works with continuous, real variables. The algorithm itself is linear, but can be combined with a prior expansion into a more powerful function space. Quadratic polynomials have been used successfully in hierarchical networks for the extraction of high-level features from complex visual stimuli. Unfortunately, this expansion makes it difficult to relate input and output components in the layers. In particular, it is generally not possible to invert the bottom-up mapping, which indicates serious obstacles for top-down processes. We explored techniques to address this inversion problem. Our methods combine gradient descent and vector quantization algorithms and allowed stimulus reconstruction at the lowest layer (see Fig. 1). The results also suggest that a further increase in reconstruction performance will require a different expansion that is partly optimized for the top-down step. [Figure] Figure 1. Stimulus reconstruction from higher-level features. (a) shows the reconstruction for a single receptive field patch on the lowest layer, with complex-cell-like output behavior. On the left is the original stimulus; on the right is the reconstruction, which was calculated from the layer output. In (b) the same reconstruction technique has been applied to a whole image. The first picture is the original stimulus, the second one is the reconstruction from the lowest layer output. The third image is the reconstruction from the second-layer output, showing some significant reconstruction errors. References 1. Wiskott L, Sejnowski TJ: Slow feature analysis: Unsupervised learning of invariances. Neural Computation 2002; 14(4):715-770. 2. Franzius M, Sprekeler H, and Wiskott L: Slowness and sparseness lead to place, head-direction and spatial-view cells. Public Library of Science (PLoS) Computational Biology, 3(8):e166, 2007. 3. Franzius M, Wilbert N, and Wiskott L: Invariant Object Recognition with Slow Feature Analysis. Proc. 18th Int'l Conf. on Artificial Neural Networks, ICANN'08, Prague, Sep 3-6, eds. Vera Kurková and Roman Neruda and Jan Koutník, publ. Springer-Verlag, pp. 961-970.
    BibTeX:
    			
                            @inproceedings{WilbertWiskott-2010,
                              author       = {N. Wilbert and L. Wiskott},
                              title        = {Hierarchical {S}low {F}eature {A}nalysis and top-down processes},
                              booktitle    = {Proc.\ Bernstein Conference on Computational Neuroscience, Sep 27--Oct 1, Berlin, Germany},
                              year         = {2010},
    			  
    			  url          = {http://www.frontiersin.org/Community/AbstractDetails.aspx?ABS_DOI=10.3389/conf.fncom.2010.51.00119},
                              url3         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/WilbertWiskott-2010-ProcBCCNBerlin-Poster-SFATopDown.pdf},
    			  
                              doi          = {http://doi.org/10.3389/conf.fncom.2010.51.00119}
                            }
    			
    					
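The abstract above reports inverting the (quadratic) bottom-up mapping by combining gradient descent with vector quantization. A minimal sketch of the gradient-descent part alone, inverting a toy quadratic expansion (hypothetical code, not the thesis implementation; `expand` and `invert` are assumed names):

```python
# Hypothetical sketch (not the thesis code): recovering an input x whose
# expanded feature vector matches a given target y, by gradient descent
# on the squared reconstruction error.
def expand(x):
    return x + [v * v for v in x]      # toy quadratic expansion [x, x^2]

def sq_error(x, y_target):
    return sum((a - b) ** 2 for a, b in zip(expand(x), y_target))

def invert(y_target, dim, steps=2000, lr=0.01, eps=1e-5):
    x = [0.0] * dim
    for _ in range(steps):
        grad = []
        for i in range(dim):           # numerical gradient of the error
            xp = list(x); xp[i] += eps
            xm = list(x); xm[i] -= eps
            grad.append((sq_error(xp, y_target) - sq_error(xm, y_target))
                        / (2 * eps))
        x = [xi - lr * g for xi, g in zip(x, grad)]
    return x

x_true = [0.5, -0.3]
x_rec = invert(expand(x_true), dim=2)
# The recovered input closely matches the original one.
print(all(abs(a - b) < 1e-2 for a, b in zip(x_rec, x_true)))
```

In the hierarchical SFA setting the expansion is not exactly invertible, which is why the thesis additionally uses vector quantization to constrain the search; this sketch covers only the well-posed toy case.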
    Wilbert, N.; Zito, T.; Schuppner, R.-B.; Jedrzejewski-Szmek, Z.; Wiskott, L. & Berkes, P. 2013 Building extensible frameworks for data processing: the case of MDP, Modular toolkit for Data Processing Journal of Computational Science , 4(5) , 345-351 .
     
    article MDP: Modular toolkit for data processing (2003-now)
    Abstract: Data processing is a ubiquitous task in scientific research, and much energy is spent on the development of appropriate algorithms. It is thus relatively easy to find software implementations of the most common methods. On the other hand, when building concrete applications, developers are often confronted with several additional chores that need to be carried out beside the individual processing steps. These include for example training and executing a sequence of several algorithms, writing code that can be executed in parallel on several processors, or producing a visual description of the application. The Modular toolkit for Data Processing (MDP) is an open source Python library that provides an implementation of several widespread algorithms and offers a unified framework to combine them to build more complex data processing architectures. In this paper we concentrate on some of the newer features of MDP, focusing on the choices made to automatize repetitive tasks for users and developers. In particular, we describe the support for parallel computing and how this is implemented via a flexible extension mechanism. We also briefly discuss the support for algorithms that require bi-directional data flow.
    BibTeX:
    			
                            @article{WilbertZitoEtAl-2013,
                              author       = {Niko Wilbert and Tiziano Zito and Rike-Benjamin Schuppner and Zbigniew Jedrzejewski-Szmek and Laurenz Wiskott and Pietro Berkes},
                              title        = {Building extensible frameworks for data processing: the case of {MDP}, {M}odular toolkit for {D}ata {P}rocessing},
                              journal      = {Journal of Computational Science},
                              year         = {2013},
                              volume       = {4},
                              number       = {5},
                              pages        = {345--351},
    			  
    			  url          = {http://www.sciencedirect.com/science/article/pii/S1877750311000913},
    			  
                              doi          = {http://doi.org/10.1016/j.jocs.2011.10.005}
                            }
    			
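The MDP paper above describes chaining trainable processing nodes into flows that automate sequential training and execution. A toy sketch of that node/flow pattern in plain Python (deliberately simplified; these are not MDP's actual classes or API):

```python
# Toy sketch of the node/flow pattern the MDP paper describes (assumed
# illustration, not MDP's real API): trainable nodes chained into a flow
# that handles sequential training and execution for the user.
class Node:
    def train(self, x): pass           # accumulate statistics (optional)
    def execute(self, x): return x     # transform the data

class Center(Node):
    def train(self, x):
        self.mean = sum(x) / len(x)
    def execute(self, x):
        return [v - self.mean for v in x]

class Scale(Node):
    def __init__(self, factor): self.factor = factor
    def execute(self, x):
        return [v * self.factor for v in x]

class Flow:
    def __init__(self, nodes): self.nodes = nodes
    def train(self, x):
        for node in self.nodes:        # train each node on its input,
            node.train(x)              # then pass the transformed data
            x = node.execute(x)        # on to the next node
    def execute(self, x):
        for node in self.nodes:
            x = node.execute(x)
        return x

flow = Flow([Center(), Scale(10)])
flow.train([1.0, 2.0, 3.0])
print(flow.execute([1.0, 2.0, 3.0]))   # → [-10.0, 0.0, 10.0]
```

The point of the abstraction is that repetitive chores (here, feeding each node's output to the next during training) are handled once by the framework rather than by every application.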
    					
    Wiskott, L. 2006 How does our visual system achieve shift and size invariance? Chapter 16 in 23 Problems in Systems Neuroscience , 322-340 .
     
    incollection Visual invariances: a review (2000)
    BibTeX:
    			
                            @incollection{Wiskott-2006b,
                              author       = {Laurenz Wiskott},
                              title        = {How does our visual system achieve shift and size invariance?},
                              booktitle    = {23 Problems in Systems Neuroscience},
                              publisher    = {Oxford University Press},
                              year         = {2006},
                              pages        = {322--340},
    			  
    			  
                              doi          = {http://doi.org/10.1093/acprof:oso/9780195148220.003.0016}
                            }
    			
    					
    Wiskott, L. 2001 Some ideas about organic computing Preproc. Organic Computing: Towards Structured Design of Processes, Nov 23-24, Paderborn, Germany , 39-42 .
     
    inproceedings N.N.
    Abstract: Well, I have been thinking about a possible position statement for the symposium on organic computing for quite some time now, but I still feel that I cannot provide any particularly qualified text. So be warned that this is a rather naive statement, in the sense that I have no experience in hardware issues of organic computing and that I did not take the time to educate myself by reading related papers. I have some experience in neuroinformatics, but I don't feel like speculating on that, since, as far as I understand, the focus of the symposium is on combining and communicating between these fields rather than pushing forward any single discipline. I also feel that ethical issues should be raised, although I have not much experience on that either. So here are some unqualified speculations and fragments on hardware-software and ethical issues of organic computing in the (partly rather far) future. Maybe they will induce some qualified thoughts on the reader's side.
    BibTeX:
    			
                            @inproceedings{Wiskott-2001b,
                              author       = {Laurenz Wiskott},
                              title        = {Some ideas about organic computing},
                              booktitle    = {Preproc.\ Organic Computing: Towards Structured Design of Processes, Nov 23--24, Paderborn, Germany},
                              year         = {2001},
                              pages        = {39--42},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/Wiskott-2001b-ProcOrganicComputing-Preprint.txt}
                            }
    			
    					
    Wiskott, L. 2013 Slow Feature Analysis Encyclopedia of Computational Neuroscience .
     
    incollection SFA: Learning visual invariances I (1997-1999), SFA: Complex cells (2001-2003), SFA: Place cells I (2003-2007), SFA: Theory of complex cells (2004-2007), SFA: Learning visual invariances II (2006-2009)
    BibTeX:
    			
                            @incollection{Wiskott-2013,
                              author       = {Laurenz Wiskott},
                              title        = {Slow {F}eature {A}nalysis},
                              booktitle    = {Encyclopedia of Computational Neuroscience},
                              publisher    = {Springer-Verlag Berlin Heidelberg},
                              year         = {2013},
    			  
    			  url          = {http://www.springerreference.com/docs/html/chapterdbid/348747.html},
    			  
                              doi          = {http://doi.org/10.1007/978-1-4614-6675-8_682}
                            }
    			
    					
    Wiskott, L. 2001 Unsupervised learning of invariances in a simple model of the visual system Proc. The Mathematical, Computational and Biological Study of Vision, Nov 4-10, Oberwolfach , 21-22 .
    (Report No. 49/2001)  
    inproceedings SFA: Learning visual invariances I (1997-1999)
    BibTeX:
    			
                            @inproceedings{Wiskott-2001a,
                              author       = {Laurenz Wiskott},
                              title        = {Unsupervised learning of invariances in a simple model of the visual system},
                              booktitle    = {Proc.\ The Mathematical, Computational and Biological Study of Vision, Nov 4--10, Oberwolfach},
                              publisher    = {Mathematisches Forschungsinstitut Oberwolfach},
                              year         = {2001},
                              pages        = {21--22}
                            }
    			
    					
    Wiskott, L. 1998 Learning invariance manifolds Proc. 8th Intl. Conf. on Artificial Neural Networks (ICANN'98), Skövde, Sweden , Perspectives in Neural Computing , 555-560 .
     
    inproceedings SFA: Learning visual invariances I (1997-1999)
    BibTeX:
    			
                            @inproceedings{Wiskott-1998b,
                              author       = {Laurenz Wiskott},
                              title        = {Learning invariance manifolds},
                              booktitle    = {Proc.\ 8th Intl.\ Conf.\ on Artificial Neural Networks (ICANN'98), Skövde, Sweden},
                              publisher    = {Springer},
                              year         = {1998},
                              pages        = {555--560},
    			  
    			  url          = {https://link.springer.com/chapter/10.1007/978-1-4471-1599-1_83},
    			  
                              doi          = {http://doi.org/10.1007/978-1-4471-1599-1_83}
                            }
    			
    					
    Wiskott, L. 1999 Unsupervised learning and generalization of translation invariance in a simple model of the visual system Learning and Adaptivity for Connectionist Models and Neural Networks, Proc. Meeting of the GI-Working Group 1.1.2 ``Connectionism'', Sep 29, Magdeburg, Germany , 56-67 .
    (GMD Report 59)  
    inproceedings SFA: Learning visual invariances I (1997-1999)
    BibTeX:
    			
                            @inproceedings{Wiskott-1999c,
                              author       = {Laurenz Wiskott},
                              title        = {Unsupervised learning and generalization of translation invariance in a simple model of the visual system},
                              booktitle    = {Learning and Adaptivity for Connectionist Models and Neural Networks, Proc.\ Meeting of the GI-Working Group 1.1.2 ``Connectionism'', Sep 29, Magdeburg, Germany},
                              publisher    = {GMD-Forschungszentrum Informationstechnik GmbH},
                              year         = {1999},
                              pages        = {56--67}
                            }
    			
    					
    Wiskott, L. 1997 Segmentation from motion: combining Gabor- and Mallat-wavelets to overcome aperture and correspondence problem Proc. 7th Intl. Conf. on Computer Analysis of Images and Patterns (CAIP'97), Kiel, Germany , Lecture Notes in Computer Science (1296) , 329-336 .
     
    inproceedings Segmentation from motion (1993)
    BibTeX:
    			
                            @inproceedings{Wiskott-1997b,
                              author       = {Laurenz Wiskott},
                              title        = {Segmentation from motion: combining {G}abor- and {M}allat-wavelets to overcome aperture and correspondence problem},
                              booktitle    = {Proc.\ 7th Intl.\ Conf.\ on Computer Analysis of Images and Patterns (CAIP'97), Kiel, Germany},
                              publisher    = {Springer-Verlag},
                              year         = {1997},
                              number       = {1296},
                              pages        = {329--336},
    			  
    			  url          = {https://link.springer.com/chapter/10.1007/3-540-63460-6_134},
    			  
                              doi          = {http://doi.org/10.1007/3-540-63460-6_134}
                            }
    			
    					
    Wiskott, L. 1997 Phantom faces for face analysis Proc. 7th Intl. Conf. on Computer Analysis of Images and Patterns (CAIP'97), Kiel, Germany , Lecture Notes in Computer Science (1296) , 480-487 .
     
    inproceedings Face analysis with EBGM (1993-1995)
    BibTeX:
    			
                            @inproceedings{Wiskott-1997c,
                              author       = {Laurenz Wiskott},
                              title        = {Phantom faces for face analysis},
                              booktitle    = {Proc.\ 7th Intl.\ Conf.\ on Computer Analysis of Images and Patterns (CAIP'97), Kiel, Germany},
                              publisher    = {Springer-Verlag},
                              year         = {1997},
                              number       = {1296},
                              pages        = {480--487},
    			  
    			  url          = {https://link.springer.com/chapter/10.1007/3-540-63460-6_153},
    			  
                              doi          = {http://doi.org/10.1007/3-540-63460-6_153}
                            }
    			
    					
    Wiskott, L. 1995 Labeled graphs and Dynamic Link Matching for face recognition and scene analysis , Reihe Physik , 53 .
    (PhD thesis)  
    book Scene analysis (1992), Face recognition with DLM (1993-1995), Face analysis with EBGM (1993-1995), Face recognition with EBGM (1993,1994)
    BibTeX:
    			
                            @book{Wiskott-1995,
                              author       = {Laurenz Wiskott},
                              title        = {Labeled graphs and {D}ynamic {L}ink {M}atching for face recognition and scene analysis},
                              publisher    = {Verlag Harri Deutsch},
                              series       = {Reihe Physik},
                              year         = {1995},
                              volume       = {53},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/Wiskott-1995-PhDThesis.pdf}
                            }
    			
    					
    Wiskott, L. 1996 Phantom faces for face analysis Technical report , IR-INI 96-06 .
     
    techreport Face analysis with EBGM (1993-1995)
    BibTeX:
    			
                            @techreport{Wiskott-1996a,
                              author       = {Laurenz Wiskott},
                              title        = {Phantom faces for face analysis},
                              institution  = {Institut für Neuroinformatik},
                              year         = {1996},
                              number       = {IR-INI 96-06},
                              type         = {Technical report}
                            }
    			
    					
    Wiskott, L. 1996 Segmentation from motion: combining Gabor- and Mallat-wavelets to overcome aperture and correspondence problem Technical report , IR-INI 96-10 .
     
    techreport Segmentation from motion (1993)
    BibTeX:
    			
                            @techreport{Wiskott-1996b,
                              author       = {Laurenz Wiskott},
                              title        = {Segmentation from motion: combining {G}abor- and {M}allat-wavelets to overcome aperture and correspondence problem},
                              institution  = {Institut für Neuroinformatik},
                              year         = {1996},
                              number       = {IR-INI 96-10},
                              type         = {Technical report}
                            }
    			
    					
    Wiskott, L. 1996 Phantom faces for face analysis Proc. of the 3rd Joint Symp. on Neural Computation, Jun 1, Pasadena, CA, USA , 6 , 46-52 .
     
    inproceedings Face analysis with EBGM (1993-1995)
    BibTeX:
    			
                            @inproceedings{Wiskott-1996c,
                              author       = {Laurenz Wiskott},
                              title        = {Phantom faces for face analysis},
                              booktitle    = {Proc.\ of the 3rd Joint Symp.\ on Neural Computation, Jun 1, Pasadena, CA, USA},
                              publisher    = {Univ.\ of California},
                              year         = {1996},
                              volume       = {6},
                              pages        = {46--52}
                            }
    			
    					
    Wiskott, L. 1997 Phantom faces for face analysis Pattern Recognition , 30(6) , 837-846 .
     
    article Face analysis with EBGM (1993-1995)
    Abstract: The system presented is part of a general object recognition system. Images of faces are represented as graphs, labeled with topographical information and local features. New graphs of faces are generated by an elastic graph matching procedure comparing the new face with a composition of stored graphs: the face bunch graph. The result of this matching process can be used to generate composite images of faces and to determine facial attributes represented in the bunch graph, such as sex or the presence of glasses or a beard.
    BibTeX:
    			
                            @article{Wiskott-1997a,
                              author       = {Laurenz Wiskott},
                              title        = {Phantom faces for face analysis},
                              journal      = {Pattern Recognition},
                              year         = {1997},
                              volume       = {30},
                              number       = {6},
                              pages        = {837--846},
    			  
    			  url          = {http://www.sciencedirect.com/science/article/pii/S003132039600132X},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/Wiskott-1997a-PatRec-FaceAnalysis-Preprint.pdf},
    			  
                              doi          = {http://doi.org/10.1016/s0031-3203(96)00132-x}
                            }
    			
    					
    Wiskott, L. 1997 Phantom faces for face analysis Proc. IEEE Intl. Conf. on Image Processing (ICIP'97), Santa Barbara, CA, USA , III 308-311 .
     
    inproceedings Face analysis with EBGM (1993-1995)
    BibTeX:
    			
                            @inproceedings{Wiskott-1997d,
                              author       = {Laurenz Wiskott},
                              title        = {Phantom faces for face analysis},
                              booktitle    = {Proc.\ IEEE Intl.\ Conf.\ on Image Processing (ICIP'97), Santa Barbara, CA, USA},
                              publisher    = {IEEE},
                              year         = {1997},
                              pages        = {III 308--311}
                            }
    			
    					
    Wiskott, L. 1998 Learning invariance manifolds Proc. of the 5th Joint Symp. on Neural Computation, May 16, San Diego, CA, USA , 8 , 196-203 .
     
    inproceedings SFA: Learning visual invariances I (1997-1999)
    BibTeX:
    			
                            @inproceedings{Wiskott-1998a,
                              author       = {Laurenz Wiskott},
                              title        = {Learning invariance manifolds},
                              booktitle    = {Proc.\ of the 5th Joint Symp.\ on Neural Computation, May 16, San Diego, CA, USA},
                              publisher    = {Univ.\ of California},
                              year         = {1998},
                              volume       = {8},
                              pages        = {196--203},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/Wiskott-1998a-JSNC-InvarianceManifolds-Preprint.pdf}
                            }
    			
    					
    Wiskott, L. 1999 The role of topographical constraints in face recognition Pattern Recognition Letters , 20(1) , 89-96 .
     
    article Topography in face recognition (1995,1997)
    Abstract: The role of topographical constraints for recognition performance is investigated systematically for the case of face recognition. Images are represented by rectangular graphs labeled with jets, based on a Gabor wavelet transform. Topographical constraints are varied between rigid and no constraints. A comparison with two elastic graph matching algorithms is made. The simple methods presented in this paper and elastic graph matching perform comparably on easy galleries, i.e. different facial expression or 11° rotation in depth. On a 22° gallery, elastic graph matching performs significantly better.
    BibTeX:
    			
                            @article{Wiskott-1999a,
                              author       = {Laurenz Wiskott},
                              title        = {The role of topographical constraints in face recognition},
                              journal      = {Pattern Recognition Letters},
                              year         = {1999},
                              volume       = {20},
                              number       = {1},
                              pages        = {89--96},
                              url          = {http://www.sciencedirect.com/science/article/pii/S0167865598001226},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/Wiskott-1999a-PattRecLett-Topography-Preprint.pdf},
                              doi          = {10.1016/s0167-8655(98)00122-6}
                            }
    			
    					
    Wiskott, L. 1999 Learning invariance manifolds Proc. Computational Neuroscience Meeting (CNS'98), Santa Barbara, CA, USA .
    (Special issue of Neurocomputing, 26/27:925--932)  
    inproceedings SFA: Learning visual invariances I (1997-1999)
    BibTeX:
    			
                            @inproceedings{Wiskott-1999b,
                              author       = {Laurenz Wiskott},
                              title        = {Learning invariance manifolds},
                              booktitle    = {Proc.\ Computational Neuroscience Meeting (CNS'98), Santa Barbara, CA, USA},
                              year         = {1999},
                              url          = {http://linkinghub.elsevier.com/retrieve/pii/S0925231299000119},
                              doi          = {10.1016/S0925-2312(99)00011-9}
                            }
    			
    					
    Wiskott, L. 1999 Segmentation from motion: combining Gabor- and Mallat-wavelets to overcome the aperture and correspondence problems Pattern Recognition , 32(10) , 1751-1766 .
     
    article Segmentation from motion (1993)
    Abstract: A new method for segmentation from motion is presented, which is designed to be part of a general object-recognition system. The key idea is to integrate information from Gabor- and Mallat-wavelet transforms of an image sequence to overcome the aperture and the correspondence problem. It is assumed that objects move fronto-parallel. Gabor-wavelet responses allow accurate estimation of image flow vectors with low spatial resolution. A histogram over this image flow field is evaluated and its local maxima provide a set of motion hypotheses. These serve to reduce the correspondence problem occurring in utilizing the Mallat-wavelet transform, which provides the required high spatial resolution in segmentation. Segmentation reliability is improved by integration over time. The system can segment several small, disconnected, and openworked objects, such as dot patterns. Several examples demonstrate the performance of the system and show that the algorithm behaves reasonably well, even if the assumption of fronto-parallel motion is not met.
    BibTeX:
    			
                            @article{Wiskott-1999d,
                              author       = {Laurenz Wiskott},
                              title        = {Segmentation from motion: combining {G}abor- and {M}allat-wavelets to overcome the aperture and correspondence problems},
                              journal      = {Pattern Recognition},
                              year         = {1999},
                              volume       = {32},
                              number       = {10},
                              pages        = {1751--1766},
                              url          = {http://www.sciencedirect.com/science/article/pii/S0031320398001794},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/Wiskott-1999d-PattRec-SegmFromMotion-Preprint.pdf},
                              doi          = {10.1016/s0031-3203(98)00179-4}
                            }
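The motion-hypothesis step of the abstract above (a histogram over the image-flow field whose local maxima give candidate object motions) can be sketched in a few lines, assuming fronto-parallel, purely translational motion. The function name and parameters are invented for illustration and do not reproduce the paper's implementation.

```python
# Sketch of the motion-hypothesis step: histogram the flow vectors and
# take local maxima of the 2D histogram as candidate motions.
import numpy as np

def motion_hypotheses(flow, bins=16, n_hyp=3):
    """flow: (M, 2) array of image-flow vectors. Returns up to n_hyp
    dominant motions as centers of the most populated histogram bins
    that are local maxima of the 2D flow histogram."""
    h, xe, ye = np.histogram2d(flow[:, 0], flow[:, 1], bins=bins)
    xc, yc = (xe[:-1] + xe[1:]) / 2, (ye[:-1] + ye[1:]) / 2
    pad = np.pad(h, 1, constant_values=-1)       # border sentinel
    # A bin is a local maximum if it is >= all 8 neighbours.
    neigh = np.stack([pad[1 + dx:bins + 1 + dx, 1 + dy:bins + 1 + dy]
                      for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                      if (dx, dy) != (0, 0)])
    is_max = (h >= neigh.max(axis=0)) & (h > 0)
    idx = np.argwhere(is_max)
    idx = idx[np.argsort(-h[is_max])][:n_hyp]    # strongest maxima first
    return [(xc[i], yc[j]) for i, j in idx]
```

Each returned hypothesis would then be tested against the high-resolution Mallat-wavelet representation to resolve the correspondence problem.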
    			
    					
    Wiskott, L. 2000 Unsupervised learning of invariances in a simple model of the visual system Proc. 9th Annual Computational Neuroscience Meeting (CNS'00), Jul 16-20, Brugge, Belgium , 157 .
     
    inproceedings SFA: Learning visual invariances I (1997-1999)
    BibTeX:
    			
                            @inproceedings{Wiskott-2000,
                              author       = {Laurenz Wiskott},
                              title        = {Unsupervised learning of invariances in a simple model of the visual system},
                              booktitle    = {Proc.\ 9th Annual Computational Neuroscience Meeting (CNS'00), Jul 16--20, Brugge, Belgium},
                              year         = {2000},
                              pages        = {157}
                            }
    			
    					
    Wiskott, L. 2003 Slow Feature Analysis: a theoretical analysis of optimal free responses Neural Computation , 15(9) , 2147-2177 .
     
    article SFA: Theory of free responses (1998-2002)
    Abstract: Temporal slowness is a learning principle that allows learning of invariant representations by extracting slowly varying features from quickly varying input signals. Slow feature analysis (SFA) is an efficient algorithm based on this principle, which has been applied to the learning of translation, scale, and other invariances in a simple model of the visual system. Here a theoretical analysis of the optimization problem solved by SFA is presented, which provides a deeper understanding of the simulation results obtained in previous studies.
    BibTeX:
    			
                            @article{Wiskott-2003a,
                              author       = {Wiskott, Laurenz},
                              title        = {Slow {F}eature {A}nalysis: a theoretical analysis of optimal free responses},
                              journal      = {Neural Computation},
                              year         = {2003},
                              volume       = {15},
                              number       = {9},
                              pages        = {2147--2177},
                              url          = {http://www.mitpressjournals.org/doi/abs/10.1162/089976603322297331},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/Wiskott-2003a-NeurComp-SFATheoryFree.pdf},
                              doi          = {10.1162/089976603322297331}
                            }
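For readers new to SFA: in the linear case the optimization problem analyzed in this paper reduces to two eigendecompositions (whiten the signal, then find the whitened directions whose temporal derivative has least variance). The sketch below is a minimal illustration with assumed variable names and no nonlinear expansion, not the authors' implementation.

```python
# Minimal linear SFA sketch (illustrative, no nonlinear expansion).
import numpy as np

def linear_sfa(x, n_out):
    """x: (T, N) multivariate signal. Returns the n_out slowest output
    signals: zero mean, unit variance, decorrelated, ordered by slowness."""
    x = x - x.mean(axis=0)                       # zero mean
    d, u = np.linalg.eigh(np.cov(x.T))           # covariance eigensystem
    z = x @ (u / np.sqrt(d))                     # whiten: cov(z) ~ identity
    dz = np.diff(z, axis=0)                      # temporal derivative
    d2, w = np.linalg.eigh(np.cov(dz.T))         # ascending eigenvalues
    return z @ w[:, :n_out]                      # smallest = slowest
```

On a mixture of a slow and a fast sinusoid, the first output recovers the slow source up to sign, matching the "constant offset and factor" ambiguity noted in the related driving-force paper below.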
    			
    					
    Wiskott, L. 2003 Estimating driving forces of nonstationary time series with Slow Feature Analysis CoRR e-print arXiv:cond-mat/0312317 .
     
    misc SFA: Estimating driving forces (2000-2003)
    Abstract: Slow feature analysis (SFA) is a new technique for extracting slowly varying features from a quickly varying signal. It is shown here that SFA can be applied to nonstationary time series to estimate a single underlying driving force with high accuracy up to a constant offset and a factor. Examples with a tent map and a logistic map illustrate the performance.
    BibTeX:
    			
                            @misc{Wiskott-2003b,
                              author       = {Laurenz Wiskott},
                              title        = {Estimating driving forces of nonstationary time series with {S}low {F}eature {A}nalysis},
                              year         = {2003},
                              howpublished = {CoRR e-print arXiv:cond-mat/0312317},
                              url          = {https://arxiv.org/abs/cond-mat/0312317/},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/Wiskott-2003b-arXiv-SFA-Application.pdf}
                            }
    			
    					
    Wiskott, L. 2003 How does our visual system achieve shift and size invariance? Cognitive Sciences EPrint Archive (CogPrints) , 3321 .
     
    misc Visual invariances: a review (2000)
    Abstract: The question of shift and size invariance in the primate visual system is discussed. After a short review of the relevant neurobiology and psychophysics, a more detailed analysis of computational models is given. The two main types of networks considered are the dynamic routing circuit model and invariant feature networks, such as the neocognitron. Some specific open questions in context of these models are raised and possible solutions discussed.
    BibTeX:
    			
                            @misc{Wiskott-2003c,
                              author       = {Laurenz Wiskott},
                              title        = {How does our visual system achieve shift and size invariance?},
                              year         = {2003},
                              howpublished = {Cognitive Sciences EPrint Archive (CogPrints), 3321},
                              url          = {http://cogprints.org/3321/},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/Wiskott-2003c-CogPrints-Invariances.pdf}
                            }
    			
    					
    Wiskott, L. 2006 Is slowness a learning principle of visual cortex? Proc. Japan-Germany Symposium on Computational Neuroscience, Feb 1-4, Wako, Saitama, Japan , 25 .
     
    inproceedings SFA: Learning visual invariances I (1997-1999), SFA: Complex cells (2001-2003)
    BibTeX:
    			
                            @inproceedings{Wiskott-2006a,
                              author       = {Laurenz Wiskott},
                              title        = {Is slowness a learning principle of visual cortex?},
                              booktitle    = {Proc.\ Japan-Germany Symposium on Computational Neuroscience, Feb 1--4, Wako, Saitama, Japan},
                              publisher    = {RIKEN Brain Science Institute},
                              year         = {2006},
                              pages        = {25}
                            }
    			
    					
    Wiskott, L. 2019 Mit digitalen Medien die eigene Lehre verändern: Inverted Classroom in der Neuroinformatik https://www.e-teaching.org .
     
    misc
    BibTeX:
    			
                            @misc{Wiskott-2019,
                              author       = {Laurenz Wiskott},
                              title        = {Mit digitalen Medien die eigene Lehre verändern: Inverted Classroom in der Neuroinformatik},
                              year         = {2019},
                              howpublished = {https://www.e-teaching.org},
                              url          = {https://www.e-teaching.org/materialien/podcasts/podcast-2019//mit-digitalen-medien-die-eigene-lehre-veraendern-inverted-classroom-in-der-neuroinformatik}
                            }
    			
    					
    Wiskott, L.; Appleby, P.A. & Kempermann, G. 2007 Adult hippocampal neurogenesis - a strategy for avoiding catastrophic interference? Proc. 3rd Annual Computational Cognitive Neuroscience Conference, Nov 1-2, San Diego, CA, USA , 9 .
     
    inproceedings Adult neurogenesis: Function II (2005-2007)
    BibTeX:
    			
                            @inproceedings{WiskottApplebyEtAl-2007b,
                              author       = {Laurenz Wiskott and Peter A. Appleby and Gerd Kempermann},
                              title        = {Adult hippocampal neurogenesis - a strategy for avoiding catastrophic interference?},
                              booktitle    = {Proc.\ 3rd Annual Computational Cognitive Neuroscience Conference, Nov 1--2, San Diego, CA, USA},
                              year         = {2007},
                              pages        = {9}
                            }
    			
    					
    Wiskott, L.; Appleby, P.A. & Kempermann, G. 2007 What is the functional role of adult neurogenesis in the hippocampus? - A computational approach Proc. Adult Neurogenesis Symposium, Oct 15, Dresden, Germany .
     
    inproceedings Adult neurogenesis: Function II (2005-2007)
    BibTeX:
    			
                            @inproceedings{WiskottApplebyEtAl-2007a,
                              author       = {Laurenz Wiskott and Peter A. Appleby and Gerd Kempermann},
                              title        = {What is the functional role of adult neurogenesis in the hippocampus? - {A} computational approach},
                              booktitle    = {Proc.\ Adult Neurogenesis Symposium, Oct 15, Dresden, Germany},
                              publisher    = {Abcam},
                              year         = {2007}
                            }
    			
    					
    Wiskott, L. & Berkes, P. 2002 Is slowness a principle for the emergence of complex cells in primary visual cortex? Proc. Berlin Neuroscience Forum, Apr 18-20, Liebenwalde, Germany , 43 .
     
    inproceedings SFA: Complex cells (2001-2003)
    BibTeX:
    			
                            @inproceedings{WiskottBerkes-2002,
                              author       = {Laurenz Wiskott and Pietro Berkes},
                              title        = {Is slowness a principle for the emergence of complex cells in primary visual cortex?},
                              booktitle    = {Proc.\ Berlin Neuroscience Forum, Apr 18--20, Liebenwalde, Germany},
                              publisher    = {Max-Delbrück-Centrum für Molekulare Medizin (MDC)},
                              year         = {2002},
                              pages        = {43}
                            }
    			
    					
    Wiskott, L. & Berkes, P. 2003 Is slowness a learning principle of the visual cortex? Proc. Jahrestagung der Deutschen Zoologischen Gesellschaft, Jun 9-13, Berlin, Germany .
    (Special issue of Zoology, 106(4):373-382)  
    inproceedings SFA: Learning visual invariances I (1997-1999), SFA: Theory of free responses (1998-2002), SFA: Complex cells (2001-2003)
    BibTeX:
    			
                            @inproceedings{WiskottBerkes-2003,
                              author       = {Laurenz Wiskott and Pietro Berkes},
                              title        = {Is slowness a learning principle of the visual cortex?},
                              booktitle    = {Proc.\ Jahrestagung der Deutschen Zoologischen Gesellschaft, Jun 9--13, Berlin, Germany},
                              year         = {2003},
                              doi          = {10.1078/0944-2006-00132}
                            }
    			
    					
    Wiskott, L.; Berkes, P.; Franzius, M.; Sprekeler, H. & Wilbert, N. 2011 Slow feature analysis Scholarpedia , 6(4) , 5282 .
     
    article SFA: Learning visual invariances I (1997-1999), SFA: Theory of free responses (1998-2002), SFA: Estimating driving forces (2000-2003), SFA: Complex cells (2001-2003), SFA: Place cells I (2003-2007), SFA: Theory of complex cells (2004-2007), Extended slow feature analysis (xSFA) (2006-2013), SFA: Learning visual invariances II (2006-2009)
    BibTeX:
    			
                            @article{WiskottBerkesEtAl-2011,
                              author       = {Wiskott, L. and Berkes, P. and Franzius, M. and Sprekeler, H. and Wilbert, N.},
                              title        = {Slow feature analysis},
                              journal      = {Scholarpedia},
                              year         = {2011},
                              volume       = {6},
                              number       = {4},
                              pages        = {5282},
                              url          = {http://www.scholarpedia.org/article/Slow_feature_analysis},
                              doi          = {10.4249/scholarpedia.5282}
                            }
    			
    					
    Wiskott, L.; Fellous, J.-M.; Krüger, N. & von der Malsburg, C. 1995 Face recognition and gender determination Proc. Intl. Workshop on Automatic Face- and Gesture-Recognition (IWAFGR'95), Zurich, Switzerland , 92-97 .
     
    inproceedings Face recognition with EBGM (1993,1994), Face analysis with EBGM (1993-1995)
    BibTeX:
    			
                            @inproceedings{WiskottFellousEtAl-1995,
                              author       = {Laurenz Wiskott and Jean-Marc Fellous and Norbert Krüger and Christoph von der Malsburg},
                              title        = {Face recognition and gender determination},
                              booktitle    = {Proc.\ Intl.\ Workshop on Automatic Face- and Gesture-Recognition (IWAFGR'95), Zurich, Switzerland},
                              publisher    = {MultiMedia Laboratory, University of Zurich},
                              year         = {1995},
                              pages        = {92--97},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/WiskottFellousEtAl-1995-ProcIWAFGR-FaceRecognitionAnalysis.pdf}
                            }
    			
    					
    Wiskott, L.; Fellous, J.-M.; Krüger, N. & von der Malsburg, C. 1999 Face recognition by Elastic Bunch Graph Matching Chapter 11 in Intelligent Biometric Techniques in Fingerprint and Face Recognition , 355-396 .
     
    incollection Face recognition with EBGM (1993,1994)
    Abstract: We present a system for recognizing human faces from single images out of a large database containing one image per person. The task is difficult because of image variation in terms of position, size, expression, and pose. The system collapses most of this variance by extracting concise face descriptions in the form of image graphs. In these, fiducial points on the face (eyes, mouth, etc.) are described by sets of wavelet components (jets). Image graph extraction is based on a novel approach, the bunch graph, which is constructed from a small set of sample image graphs. Recognition is based on a straightforward comparison of image graphs. We report recognition experiments on the FERET database as well as the Bochum database, including recognition across pose.
    BibTeX:
    			
                            @incollection{WiskottFellousEtAl-1999,
                              author       = {Laurenz Wiskott and Jean-Marc Fellous and Norbert Krüger and Christoph von der Malsburg},
                              title        = {Face recognition by {E}lastic {B}unch {G}raph {M}atching},
                              booktitle    = {Intelligent Biometric Techniques in Fingerprint and Face Recognition},
                              publisher    = {CRC Press},
                              year         = {1999},
                              pages        = {355--396},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/WiskottFellousEtAl-1999-JainBook-FaceRecognition-Preprint.pdf}
                            }
    			
    					
    Wiskott, L.; Fellous, J.-M.; Krüger, N. & von der Malsburg, C. 1997 Face recognition by Elastic Bunch Graph Matching Proc. 7th Intl. Conf. on Computer Analysis of Images and Patterns (CAIP'97), Kiel, Germany , Lecture Notes in Computer Science , 1296 , 456-463 .
     
    inproceedings Face recognition with EBGM (1993,1994)
    BibTeX:
    			
                            @inproceedings{WiskottFellousEtAl-1997b,
                              author       = {Laurenz Wiskott and Jean-Marc Fellous and Norbert Krüger and Christoph von der Malsburg},
                              title        = {Face recognition by {E}lastic {B}unch {G}raph {M}atching},
                              booktitle    = {Proc.\ 7th Intl.\ Conf.\ on Computer Analysis of Images and Patterns (CAIP'97), Kiel, Germany},
                              publisher    = {Springer-Verlag},
                              year         = {1997},
                              series       = {Lecture Notes in Computer Science},
                              volume       = {1296},
                              pages        = {456--463},
                              url          = {https://link.springer.com/chapter/10.1007/3-540-63460-6_150},
                              doi          = {10.1007/3-540-63460-6_150}
                            }
    			
    					
    Wiskott, L.; Fellous, J.-M.; Krüger, N. & von der Malsburg, C. 1996 Face recognition by Elastic Bunch Graph Matching Technical report , IR-INI 96-08 .
     
    techreport Face recognition with EBGM (1993,1994)
    BibTeX:
    			
                            @techreport{WiskottFellousEtAl-1996,
                              author       = {Laurenz Wiskott and Jean-Marc Fellous and Norbert Krüger and Christoph von der Malsburg},
                              title        = {Face recognition by {E}lastic {B}unch {G}raph {M}atching},
                              institution  = {Institut für Neuroinformatik},
                              type         = {Technical report},
                              number       = {IR-INI 96-08},
                              year         = {1996}
                            }
    			
    					
    Wiskott, L.; Fellous, J.-M.; Krüger, N. & von der Malsburg, C. 1997 Face recognition by Elastic Bunch Graph Matching IEEE Trans. on Pattern Analysis and Machine Intelligence , 19(7) , 775-779 .
     
    article Face recognition with EBGM (1993,1994)
    Abstract: We present a system for recognizing human faces from single images out of a large database containing one image per person. Faces are represented by labeled graphs, based on a Gabor wavelet transform. Image graphs of new faces are extracted by an elastic graph matching process and can be compared by a simple similarity function. The system differs from the preceding one in three respects. Phase information is used for accurate node positioning. Object-adapted graphs are used to handle large rotations in depth. Image graph extraction is based on a novel data structure, the bunch graph, which is constructed from a small set of sample image graphs.
    BibTeX:
    			
                            @article{WiskottFellousEtAl-1997a,
                              author       = {Laurenz Wiskott and Jean-Marc Fellous and Norbert Krüger and Christoph von der Malsburg},
                              title        = {Face recognition by {E}lastic {B}unch {G}raph {M}atching},
                              journal      = {IEEE Trans.\ on Pattern Analysis and Machine Intelligence},
                              year         = {1997},
                              volume       = {19},
                              number       = {7},
                              pages        = {775--779},
                              url          = {http://doi.ieeecomputersociety.org/10.1109/34.598235},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/WiskottFellousEtAl-1997a-PAMI-FaceRecognition-Preprint.pdf},
                              doi          = {10.1109/34.598235}
                            }
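The "simple similarity function" between image graphs mentioned in the abstract can be sketched as a normalized dot product of Gabor-jet magnitudes, averaged over corresponding nodes. The names and the restriction to magnitudes are simplifications for illustration; the paper additionally defines a phase-sensitive variant used for accurate node positioning during matching.

```python
# Illustrative jet/graph comparison (hypothetical names, not the paper's code).
import numpy as np

def jet_similarity(jet1, jet2):
    """Normalized dot product of Gabor-jet magnitudes, in [0, 1]."""
    a1, a2 = np.abs(jet1), np.abs(jet2)
    return float(a1 @ a2 / (np.linalg.norm(a1) * np.linalg.norm(a2)))

def graph_similarity(graph1, graph2):
    """Mean jet similarity over corresponding nodes of two image graphs."""
    return float(np.mean([jet_similarity(j1, j2)
                          for j1, j2 in zip(graph1, graph2)]))
```

Recognition then amounts to ranking all gallery graphs by this score against the probe graph and reporting the best match.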
    			
    					
    Wiskott, L.; Fellous, J.-M.; Krüger, N. & von der Malsburg, C. 1997 Face recognition by Elastic Bunch Graph Matching Proc. IEEE Intl. Conf. on Image Processing (ICIP'97), Santa Barbara, CA, USA , I 129-132 .
     
    inproceedings Face recognition with EBGM (1993,1994)
    BibTeX:
    			
                            @inproceedings{WiskottFellousEtAl-1997c,
                              author       = {Laurenz Wiskott and Jean-Marc Fellous and Norbert Krüger and Christoph von der Malsburg},
                              title        = {Face recognition by {E}lastic {B}unch {G}raph {M}atching},
                              booktitle    = {Proc.\ IEEE Intl.\ Conf.\ on Image Processing (ICIP'97), Santa Barbara, CA, USA},
                              publisher    = {IEEE},
                              year         = {1997},
                              pages        = {I 129--132}
                            }
    			
    					
    Wiskott, L.; Franzius, M.; Berkes, P. & Sprekeler, H. 2007 Is slowness a learning principle of the visual system? Proc. 39th European Brain and Behaviour Society Meeting (EBBS), Sep 15-19, Trieste, Italy , 14-15 .
    (Special issue of Neural Plasticity, Article ID 23250)  
    inproceedings SFA: Complex cells (2001-2003), SFA: Theory of complex cells (2004-2007), SFA: Learning visual invariances II (2006-2009)
    BibTeX:
    			
                            @inproceedings{WiskottFranziusEtAl-2007,
                              author       = {Laurenz Wiskott and Mathias Franzius and Pietro Berkes and Henning Sprekeler},
                              title        = {Is slowness a learning principle of the visual system?},
                              booktitle    = {Proc.\ 39th European Brain and Behaviour Society Meeting (EBBS), Sep 15--19, Trieste, Italy},
                              year         = {2007},
                              pages        = {14--15}
                            }
    			
    					
    Wiskott, L.; Franzius, M.; Sprekeler, H. & Appleby, P. 2009 Self-organization of place cells with slowness, sparseness, and neurogenesis Proc. 41st European Brain and Behaviour Society Meeting (EBBS), Sep 13-18, Rhodes Island, Greece .
    (Special issue of Frontiers in Behavioral Neuroscience, doi: 10.3389/conf.neuro.08.2009.09.062)  
    inproceedings SFA: Place cells I (2003-2007), Adult neurogenesis: Function III (2007-2010)
    BibTeX:
    			
                            @inproceedings{WiskottFranziusEtAl-2009,
                              author       = {Laurenz Wiskott and Mathias Franzius and Henning Sprekeler and Peter Appleby},
                              title        = {Self-organization of place cells with slowness, sparseness, and neurogenesis},
                              booktitle    = {Proc.\ 41st European Brain and Behaviour Society Meeting (EBBS), Sep 13--18, Rhodes Island, Greece},
                              year         = {2009}
                            }
    			
    					
    Wiskott, L. & von der Malsburg, C. 1995 Face recognition by Dynamic Link Matching Proc. Intl. Conf. on Artificial Neural Networks (ICANN'95), Paris, France , 347-352 .
     
    inproceedings Face recognition with DLM (1993-1995)
    BibTeX:
    			
                            @inproceedings{WiskottMalsburg-1995,
                              author       = {Laurenz Wiskott and Christoph von der Malsburg},
                              title        = {Face recognition by {D}ynamic {L}ink {M}atching},
                              booktitle    = {Proc.\ Intl.\ Conf.\ on Artificial Neural Networks (ICANN'95), Paris, France},
                              publisher    = {EC2 \& Cie},
                              year         = {1995},
                              pages        = {347--352}
                            }
    			
    					
    Wiskott, L. & von der Malsburg, C. 1994 A neural system for the recognition of partially occluded objects in cluttered scenes: a pilot study Advances in Pattern Recognition Systems using Neural Networks Technologies , Machine Perception and Artificial Intelligence , 7 .
     
    incollection Scene analysis (1992)
    BibTeX:
    			
                            @incollection{WiskottMalsburg-1994a,
                              author       = {Laurenz Wiskott and Christoph von der Malsburg},
                              title        = {A neural system for the recognition of partially occluded objects in cluttered scenes: a pilot study},
                              booktitle    = {Advances in Pattern Recognition Systems using Neural Networks Technologies},
                              publisher    = {World Scientific},
                              series       = {Machine Perception and Artificial Intelligence},
                              year         = {1994},
                              volume       = {7}
                            }
    			
    					
    Wiskott, L. & von der Malsburg, C. 1994 Object recognition with Dynamic Link Matching Neural Computing , Dagstuhl-Seminar-Report , 103 , 20-21 .
     
    inproceedings Face recognition with DLM (1993-1995)
    BibTeX:
    			
                            @inproceedings{WiskottMalsburg-1994b,
                              author       = {Laurenz Wiskott and Christoph von der Malsburg},
                              title        = {Object recognition with {D}ynamic {L}ink {M}atching},
                              booktitle    = {Neural Computing},
                              publisher    = {Schloss Dagstuhl},
                              series       = {Dagstuhl-Seminar-Report},
                              year         = {1994},
                              volume       = {103},
                              pages        = {20--21}
                            }
    			
    					
    Wiskott, L. & von der Malsburg, C. 1999 Objekterkennung in einem selbstorganisierenden neuronalen System Komplexe Systeme und Nichtlineare Dynamik in Natur und Gesellschaft , 169-188 .
     
    incollection Face recognition with DLM (1993-1995)
    BibTeX:
    			
                            @incollection{WiskottMalsburg-1999,
                              author       = {Laurenz Wiskott and Christoph von der Malsburg},
                              title        = {Objekterkennung in einem selbstorganisierenden neuronalen {S}ystem},
                              booktitle    = {Komplexe Systeme und Nichtlineare Dynamik in Natur und Gesellschaft},
                              publisher    = {Springer-Verlag},
                              year         = {1999},
                              pages        = {169--188},
                              url          = {https://link.springer.com/chapter/10.1007/978-3-642-60063-0_10},
                              doi          = {10.1007/978-3-642-60063-0_10}
                            }
    			
    					
    Wiskott, L. & von der Malsburg, C. 1996 Face recognition by Dynamic Link Matching Chapter 11 in Lateral Interactions in the Cortex: Structure and Function .
    (Electronic book, ISBN 0-9647060-0-8)  
    incollection Face recognition with DLM (1993-1995)
    Abstract: We present a neural system for the recognition of objects from realistic images, together with results of tests of face recognition from a large gallery. The system is inherently invariant with respect to shift, and is robust against many other variations, most notably rotation in depth and deformation. The system is based on Dynamic Link Matching. It consists of an image domain and a model domain, which we tentatively identify with primary visual cortex and infero-temporal cortex. Both domains have the form of neural sheets of hypercolumns, which are composed of simple feature detectors (modeled as Gabor-based wavelets). Each object is represented in memory by a separate model sheet, that is, a two-dimensional array of features. The match of the image to the models is performed by network self-organization, in which rapid reversible synaptic plasticity of the connections ('dynamic links') between the two domains is controlled by signal correlations, which are shaped by fixed inter-columnar connections and by the dynamic links themselves. The system requires very little genetic or learned structure, relying essentially on the rules of rapid synaptic plasticity and the a priori constraint of preservation of topography to find matches. This constraint is encoded within the neural sheets with the help of lateral connections, which are excitatory over short range and inhibitory over long range.
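The matching idea described in the abstract can be illustrated with a deliberately tiny sketch (not the paper's neural dynamics): links between an "image" sheet and a "model" sheet start from feature similarity and are then reinforced wherever topographic neighbors agree, a crude stand-in for correlation-driven dynamic links under the topography constraint. The 1-D sheets, cyclic boundaries, and update rule here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, shift = 8, 16, 2

# Toy setup: the "image" is a shifted copy of the "model" feature sheet.
model = rng.normal(size=(n, d))
model /= np.linalg.norm(model, axis=1, keepdims=True)   # unit feature vectors
image = np.roll(model, shift, axis=0)

# Initial links: feature similarity between every image/model node pair.
W = np.exp(image @ model.T)
W /= W.sum(axis=1, keepdims=True)

for _ in range(20):
    # Support from diagonal neighbors enforces preservation of topography
    # (cyclic boundary conditions on a 1-D sheet).
    support = (np.roll(np.roll(W, 1, 0), 1, 1) +
               np.roll(np.roll(W, -1, 0), -1, 1))
    W *= 1.0 + support
    W /= W.sum(axis=1, keepdims=True)

# Image node i should map to model node (i - shift) % n.
match = W.argmax(axis=1)
```

After a few iterations the link matrix concentrates on the correct cyclic diagonal, i.e. the shift between image and model is recovered.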
    BibTeX:
    			
                            @incollection{WiskottMalsburg-1996d,
                              author       = {Laurenz Wiskott and Christoph von der Malsburg},
                              title        = {Face recognition by {D}ynamic {L}ink {M}atching},
                              booktitle    = {Lateral Interactions in the Cortex: Structure and Function},
                              publisher    = {The UTCS Neural Networks Research Group, Austin, TX},
                              year         = {1996},
                              url          = {http://www.cs.utexas.edu/users/nn/web-pubs/htmlbook96/},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/WiskottMalsburg-1996d-WWWBook-FaceRecognitionDLM-Preprint.pdf}
                            }
    			
    					
    Wiskott, L. & von der Malsburg, C. 1996 Recognizing faces by Dynamic Link Matching Symposium über biologische Informationsverarbeitung und Neuronale Netze (SINN'95), Germany , Beiträge zur wissenschaftlichen Diskussion , 16 , 63-68 .
     
    inproceedings Face recognition with DLM (1993-1995)
    BibTeX:
    			
                            @inproceedings{WiskottMalsburg-1996c,
                              author       = {Laurenz Wiskott and Christoph von der Malsburg},
                              title        = {Recognizing faces by {D}ynamic {L}ink {M}atching},
                              booktitle    = {Symposium über biologische Informationsverarbeitung und Neuronale Netze (SINN'95), Germany},
                              publisher    = {Hanns-Seidel-Stiftung},
                              year         = {1996},
                              volume       = {16},
                              pages        = {63--68}
                            }
    			
    					
    Wiskott, L. & von der Malsburg, C. 1993 A neural system for the recognition of partially occluded objects in cluttered scenes: a pilot study Intl. J. of Pattern Recognition and Artificial Intelligence , 7(4) , 935-948 .
     
    article Scene analysis (1992)
    Abstract: We present a system for the interpretation of camera images of scenes composed of several known objects with mutual occlusion. The scenes are analyzed by the recognition of the objects present and by the determination of their occlusion relations. Objects are internally represented by stored model graphs. These are formed in a semi-automatic way by showing objects against a varying background. Objects are recognized by dynamic link matching. Our experiments show that our system is very successful in analyzing cluttered scenes. The system architecture goes beyond classical neural networks by making extensive use of flexible links between units, as proposed in the dynamic link architecture. The present implementation is, however, rather algorithmic in style and is to be regarded as a pilot study that is preparing the way for a detailed implementation of the architecture.
    BibTeX:
    			
                            @article{WiskottMalsburg-1993,
                              author       = {Laurenz Wiskott and Christoph von der Malsburg},
                              title        = {A neural system for the recognition of partially occluded objects in cluttered scenes: a pilot study},
                              journal      = {Intl.\ J. of Pattern Recognition and Artificial Intelligence},
                              year         = {1993},
                              volume       = {7},
                              number       = {4},
                              pages        = {935--948},
                              url          = {http://www.worldscientific.com/doi/abs/10.1142/S0218001493000479},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/WiskottMalsburg-1993-IJPRAI-SceneAnalysis-Preprint.pdf},
                              doi          = {http://doi.org/10.1142/S0218001493000479}
                            }
    			
    					
    Wiskott, L. & von der Malsburg, C. 1996 Face recognition by Dynamic Link Matching Technical report , IR-INI 96-05 .
     
    techreport Face recognition with DLM (1993-1995)
    BibTeX:
    			
                            @techreport{WiskottMalsburg-1996a,
                              author       = {Laurenz Wiskott and Christoph von der Malsburg},
                              title        = {Face recognition by {D}ynamic {L}ink {M}atching},
                              institution  = {Institut für Neuroinformatik},
                              year         = {1996},
                              number       = {IR-INI 96-05},
                              howpublished = {Technical report},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/WiskottMalsburg-1996a-IRINI9605-FaceRecognitionDLM.pdf}
                            }
    			
    					
    Wiskott, L. & von der Malsburg, C. 1996 Recognizing faces by Dynamic Link Matching Proc. US-EC Workshop on Neuroinformatics 1995, Washington DC, USA .
    (Special issue of NeuroImage, 4(3):S14--S18)  
    inproceedings Face recognition with DLM (1993-1995)
    BibTeX:
    			
                            @inproceedings{WiskottMalsburg-1996b,
                              author       = {Laurenz Wiskott and Christoph von der Malsburg},
                              title        = {Recognizing faces by {D}ynamic {L}ink {M}atching},
                              booktitle    = {Proc.\ US-EC Workshop on Neuroinformatics 1995, Washington DC, USA},
                              year         = {1996},
                              url          = {http://www.sciencedirect.com/science/article/pii/S1053811996900439},
                              doi          = {http://doi.org/10.1006/nimg.1996.0043}
                            }
    			
    					
    Wiskott, L. & von der Malsburg, C. 2001 Labeled bunch graphs for image analysis United States Patent , 6,222,939 .
     
    misc Face recognition with EBGM (1993,1994)
    BibTeX:
    			
                            @misc{WiskottMalsburg-2001,
                              author       = {Laurenz Wiskott and Christoph von der Malsburg},
                              title        = {Labeled bunch graphs for image analysis},
                              year         = {2001},
                              volume       = {6,222,939},
                              howpublished = {United States Patent}
                            }
    			
    					
    Wiskott, L. & von der Malsburg, C. 2002 Labeled bunch graphs for image analysis United States Patent , 6,356,659 .
     
    misc Face recognition with EBGM (1993,1994)
    BibTeX:
    			
                            @misc{WiskottMalsburg-2002,
                              author       = {Laurenz Wiskott and Christoph von der Malsburg},
                              title        = {Labeled bunch graphs for image analysis},
                              year         = {2002},
                              volume       = {6,356,659},
                              howpublished = {United States Patent}
                            }
    			
    					
    Wiskott, L. & von der Malsburg, C. 2003 Labeled bunch graphs for image analysis United States Patent , 6,563,950 .
     
    misc Face recognition with EBGM (1993,1994)
    BibTeX:
    			
                            @misc{WiskottMalsburg-2003,
                              author       = {Laurenz Wiskott and Christoph von der Malsburg},
                              title        = {Labeled bunch graphs for image analysis},
                              year         = {2003},
                              volume       = {6,563,950},
                              howpublished = {United States Patent}
                            }
    			
    					
    Wiskott, L.; von der Malsburg, C. & Weitzenfeld, A. 2002 Face recognition by Dynamic Link Matching Chapter 18 in The Neural Simulation Language: A System for Brain Modeling , 343-372 .
     
    incollection Face recognition with DLM (1993-1995)
    BibTeX:
    			
                            @incollection{WiskottMalsburgEtAl-2002,
                              author       = {Laurenz Wiskott and Christoph von der Malsburg and Alfredo Weitzenfeld},
                              title        = {Face recognition by {D}ynamic {L}ink {M}atching},
                              booktitle    = {The Neural Simulation Language: A System for Brain Modeling},
                              publisher    = {MIT Press},
                              year         = {2002},
                              pages        = {343--372}
                            }
    			
    					
    Wiskott, L.; Quang, M.H.; Sprekeler, H. & Zito, T. 2010 Slow Feature Analysis: analyzing signals with the slowness principle Proc. 2nd joint Statistical Meeting Deutsche Arbeitsgemeinschaft Statistik (DAGStat'10), Mar 23-26, Dortmund, Germany , 398 .
     
    inproceedings SFA: Estimating driving forces (2000-2003), Extended slow feature analysis (xSFA) (2006-2013)
    BibTeX:
    			
                            @inproceedings{WiskottQuangEtAl-2010,
                              author       = {Laurenz Wiskott and Minh Ha Quang and Henning Sprekeler and Tiziano Zito},
                              title        = {Slow {F}eature {A}nalysis: analyzing signals with the slowness principle},
                              booktitle    = {Proc.\ 2nd joint Statistical Meeting Deutsche Arbeitsgemeinschaft Statistik (DAGStat'10), Mar 23--26, Dortmund, Germany},
                              publisher    = {Technische Universität Dortmund},
                              year         = {2010},
                              pages        = {398}
                            }
    			
    					
    Wiskott, L.; Rasch, M. & Kempermann, G. 2004 What is the functional role of adult neurogenesis in the hippocampus? Cognitive Sciences EPrint Archive (CogPrints) , 4012 .
     
    misc Adult neurogenesis: Function I (2000-2003)
    BibTeX:
    			
                            @misc{WiskottRaschEtAl-2004,
                              author       = {Laurenz Wiskott and Malte Rasch and Gerd Kempermann},
                              title        = {What is the functional role of adult neurogenesis in the hippocampus?},
                              year         = {2004},
                              volume       = {4012},
                              howpublished = {Cognitive Sciences EPrint Archive (CogPrints)},
                              url          = {http://cogprints.org/4012/},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/WiskottRaschEtAl-2004-CogPrints-Neurogenesis.pdf}
                            }
    			
    					
    Wiskott, L.; Rasch, M. & Kempermann, G. 2005 What is the functional role of adult neurogenesis in the hippocampus? Proc. Computational and Systems Neuroscience (COSYNE'05), Salt Lake City, USA .
     
    inproceedings Adult neurogenesis: Function I (2000-2003)
    BibTeX:
    			
                            @inproceedings{WiskottRaschEtAl-2005,
                              author       = {Laurenz Wiskott and Malte Rasch and Gerd Kempermann},
                              title        = {What is the functional role of adult neurogenesis in the hippocampus?},
                              booktitle    = {Proc.\ Computational and Systems Neuroscience (COSYNE'05), Salt Lake City, USA},
                              year         = {2005}
                            }
    			
    					
    Wiskott, L.; Rasch, M. & Kempermann, G. 2006 A functional hypothesis for adult hippocampal neurogenesis: avoidance of catastrophic interference in the dentate gyrus Hippocampus , 16(3) , 329-343 .
     
    article Adult neurogenesis: Function I (2000-2003)
    Abstract: The dentate gyrus is part of the hippocampal memory system and special in that it generates new neurons throughout life. Here we discuss the question of what the functional role of these new neurons might be. Our hypothesis is that they help the dentate gyrus to avoid the problem of catastrophic interference when adapting to new environments. We assume that old neurons are rather stable and preserve an optimal encoding learned for known environments while new neurons are plastic to adapt to those features that are qualitatively new in a new environment. A simple network simulation demonstrates that adding new plastic neurons is indeed a successful strategy for adaptation without catastrophic interference.
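The adaptation-without-interference argument can be made concrete with a deliberately simple least-squares stand-in (not the paper's network simulation; the teacher tasks and feature split are assumptions): stable "old neurons" are trained on environment A and frozen, while plastic "new neurons" fit only the qualitatively new features of environment B.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300

# Environment A uses 5 "old" features; environment B additionally activates
# 3 qualitatively new features and changes the target function.
t_old_A = rng.normal(size=5)
t_old_B = t_old_A + rng.normal(size=5)
t_new_B = rng.normal(size=3)

Xa = rng.normal(size=(n, 5))
ya = Xa @ t_old_A
Xb_old = rng.normal(size=(n, 5))
Xb_new = rng.normal(size=(n, 3))
yb = Xb_old @ t_old_B + Xb_new @ t_new_B

lstsq = lambda X, y: np.linalg.lstsq(X, y, rcond=None)[0]

# Stable "old neurons": trained on A, then frozen.
w_stable = lstsq(Xa, ya)

# Strategy 1: retrain the old weights on B -> the encoding of A is erased.
w_retrained = lstsq(Xb_old, yb)

# Strategy 2: keep old weights frozen; plastic "new neurons" fit only the
# part of B that the old encoding cannot explain.
w_plastic = lstsq(Xb_new, yb - Xb_old @ w_stable)

# In A the new features are silent, so the plastic units leave A untouched.
err_A_frozen = np.mean((Xa @ w_stable - ya) ** 2)        # essentially zero
err_A_retrained = np.mean((Xa @ w_retrained - ya) ** 2)  # large: interference
```

The frozen-plus-plastic strategy preserves performance on the old environment exactly, while full retraining destroys it.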
    BibTeX:
    			
                            @article{WiskottRaschEtAl-2006,
                              author       = {Laurenz Wiskott and Malte Rasch and Gerd Kempermann},
                              title        = {A functional hypothesis for adult hippocampal neurogenesis: avoidance of catastrophic interference in the dentate gyrus},
                              journal      = {Hippocampus},
                              year         = {2006},
                              volume       = {16},
                              number       = {3},
                              pages        = {329--343},
                              url          = {http://onlinelibrary.wiley.com/doi/10.1002/hipo.20167/abstract},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/WiskottRaschEtAl-2006-Hippocampus-Neurogenesis-Preprint.pdf},
                              doi          = {http://doi.org/10.1002/hipo.20167}
                            }
    			
    					
    Wiskott, L.; Rasch, M.J. & Kempermann, G. 2007 What is the functional role of adult neurogenesis in the hippocampus? Kognitionsforschung 2007, Beiträge zur 8. Jahrestagung der Gesellschaft für Kognitionswissenschaft (KogWis'07), Mar 19-21, Saarbrücken, Germany , 53 .
     
    inproceedings Adult neurogenesis: Function I (2000-2003)
    BibTeX:
    			
                            @inproceedings{WiskottRaschEtAl-2007,
                              author       = {Laurenz Wiskott and Malte J. Rasch and Gerd Kempermann},
                              title        = {What is the functional role of adult neurogenesis in the hippocampus?},
                              booktitle    = {Kognitionsforschung 2007, Beiträge zur 8. Jahrestagung der Gesellschaft für Kognitionswissenschaft (KogWis'07), Mar 19--21, Saarbrücken, Germany},
                              publisher    = {Shaker Verlag},
                              year         = {2007},
                              pages        = {53}
                            }
    			
    					
    Wiskott, L. & Schönfeld, F. 2020 Laplacian Matrix for Dimensionality Reduction and Clustering Big Data Management and Analytics , 93-119 .
     
    inproceedings
    Abstract: Many problems in machine learning can be expressed by means of a graph with nodes representing training samples and edges representing the relationship between samples in terms of similarity, temporal proximity, or label information. Graphs can in turn be represented by matrices. A special example is the Laplacian matrix, which allows us to assign each node a value that varies only little between strongly connected nodes and more between distant nodes. Such an assignment can be used to extract a useful feature representation, find a good embedding of data in a low dimensional space, or perform clustering on the original samples. In these lecture notes we first introduce the Laplacian matrix and then present a small number of algorithms designed around it for data visualization and feature extraction.
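A minimal sketch of the idea, assuming the plain unnormalized Laplacian L = D - A (the lecture notes also cover normalized variants): eigenvectors with the smallest non-zero eigenvalues assign nearby values to strongly connected nodes, which yields an embedding and, via the sign of the Fiedler vector, a clustering.

```python
import numpy as np

def laplacian_embedding(A, dim=1):
    """Embed graph nodes via the unnormalized Laplacian L = D - A (a sketch).

    Connected nodes receive similar coordinates; the eigenvectors with the
    smallest non-zero eigenvalues give the embedding.
    """
    L = np.diag(A.sum(axis=1)) - A
    eigval, eigvec = np.linalg.eigh(L)   # eigenvalues in ascending order
    return eigvec[:, 1:1 + dim]          # skip the constant eigenvector

# Two triangles joined by a single bridge edge: an obvious two-cluster graph.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0

coords = laplacian_embedding(A, dim=1)   # the Fiedler vector
labels = (coords[:, 0] > 0).astype(int)  # its sign splits the two triangles
```

For this graph the sign pattern of the Fiedler vector recovers the two triangles as clusters, regardless of the eigenvector's arbitrary overall sign.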
    BibTeX:
    			
                            @inproceedings{WiskottSchoenfeld-2020,
                              author       = {Wiskott, Laurenz and Sch{\"o}nfeld, Fabian},
                              title        = {Laplacian Matrix for Dimensionality Reduction and Clustering},
                              booktitle    = {Big Data Management and Analytics},
                              publisher    = {Springer International Publishing},
                              year         = {2020},
                              pages        = {93--119}
                            }
    			
    					
    Wiskott, L. & Schönfeld, F. 2019 Laplacian Matrix for Dimensionality Reduction and Clustering - Lecture Notes CoRR e-print arXiv:1909.08381 .
     
    misc
    BibTeX:
    			
                            @misc{WiskottSchoenfeld-2019,
                              author       = {Wiskott, Laurenz and Schönfeld, Fabian},
                              title        = {Laplacian Matrix for Dimensionality Reduction and Clustering - Lecture Notes},
                              year         = {2019},
                              howpublished = {e-print arXiv:1909.08381},
                              url          = {https://arxiv.org/abs/1909.08381}
                            }
    			
    					
    Wiskott, L. & Sejnowski, T. 1997 Objective functions for neural map formation Proc. 7th Intl. Conf. on Artificial Neural Networks (ICANN'97), Lausanne, Switzerland , Lecture Notes in Computer Science , 1327 , 243-248 .
     
    inproceedings Neural map formation (1996,1997)
    BibTeX:
    			
                            @inproceedings{WiskottSejnowski-1997c,
                              author       = {Laurenz Wiskott and Terrence Sejnowski},
                              title        = {Objective functions for neural map formation},
                              booktitle    = {Proc.\ 7th Intl.\ Conf.\ on Artificial Neural Networks (ICANN'97), Lausanne, Switzerland},
                              publisher    = {Springer-Verlag},
                              year         = {1997},
                              volume       = {1327},
                              pages        = {243--248},
                              url          = {https://link.springer.com/chapter/10.1007/BFb0020163},
                              doi          = {http://doi.org/10.1007/bfb0020163}
                            }
    			
    					
    Wiskott, L. & Sejnowski, T. 2001 Constrained optimization for neural map formation: a unifying framework for weight growth and normalization Self-organizing map formation: Foundations of neural computation. , 83-128 .
     
    incollection Neural map formation (1996,1997)
    BibTeX:
    			
                            @incollection{WiskottSejnowski-2001,
                              author       = {Laurenz Wiskott and Terrence Sejnowski},
                              title        = {Constrained optimization for neural map formation: a unifying framework for weight growth and normalization},
                              booktitle    = {Self-organizing map formation: Foundations of neural computation.},
                              publisher    = {MIT Press},
                              year         = {2001},
                              pages        = {83--128}
                            }
    			
    					
    Wiskott, L. & Sejnowski, T. 1997 Objective functions for neural map formation Technical report , INC-9701 .
     
    techreport Neural map formation (1996,1997)
    BibTeX:
    			
                            @techreport{WiskottSejnowski-1997a,
                              author       = {Laurenz Wiskott and Terrence Sejnowski},
                              title        = {Objective functions for neural map formation},
                              institution  = {Institute for Neural Computation},
                              year         = {1997},
                              number       = {INC-9701},
                              howpublished = {Technical report}
                            }
    			
    					
    Wiskott, L. & Sejnowski, T. 1997 Objective functions for neural map formation Proc. of the 4th Joint Symp. on Neural Computation, May 17, Los Angeles, CA, USA , 7 , 242-248 .
     
    inproceedings Neural map formation (1996,1997)
    BibTeX:
    			
                            @inproceedings{WiskottSejnowski-1997b,
                              author       = {Laurenz Wiskott and Terrence Sejnowski},
                              title        = {Objective functions for neural map formation},
                              booktitle    = {Proc.\ of the 4th Joint Symp.\ on Neural Computation, May 17, Los Angeles, CA, USA},
                              publisher    = {Univ.\ of California},
                              year         = {1997},
                              volume       = {7},
                              pages        = {242--248}
                            }
    			
    					
    Wiskott, L. & Sejnowski, T. 1998 Constrained optimization for neural map formation: a unifying framework for weight growth and normalization Neural Computation , 10(3) , 671-716 .
     
    article Neural map formation (1996,1997)
    Abstract: Computational models of neural map formation can be considered on at least three different levels of abstraction: detailed models including neural activity dynamics, weight dynamics that abstract from the neural activity dynamics by an adiabatic approximation, and constrained optimization from which equations governing weight dynamics can be derived. Constrained optimization uses an objective function, from which a weight growth rule can be derived as a gradient flow, and some constraints, from which normalization rules are derived. In this paper we present an example of how an optimization problem can be derived from detailed non-linear neural dynamics. A systematic investigation reveals how different weight dynamics introduced previously can be derived from two types of objective function terms and two types of constraints. This includes dynamic link matching as a special case of neural map formation. We focus in particular on the role of coordinate transformations to derive different weight dynamics from the same optimization problem. Several examples illustrate how the constrained optimization framework can help in understanding, generating, and comparing different models of neural map formation. The techniques used in this analysis may also be useful in investigating other types of neural dynamics.
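The two ingredients named in the abstract, a growth rule derived as a gradient flow of an objective and a normalization rule derived from a constraint, can be illustrated with the simplest linear Hebbian case (illustrative only; the paper's framework is far more general, and the correlation matrix here is an assumed example):

```python
import numpy as np

# Weight growth as gradient ascent on H(w) = w^T C w / 2, paired with an
# explicit norm constraint as the normalization rule.
C = np.array([[2.0, 0.5],
              [0.5, 1.0]])                 # assumed input correlation matrix

rng = np.random.default_rng(0)
w = rng.normal(size=2)
for _ in range(1000):
    w += 0.01 * (C @ w)                    # growth rule: gradient of H
    w /= np.linalg.norm(w)                 # constraint: keep ||w|| = 1

# Under this pairing the weights converge to the principal eigenvector of C.
_, eigvec = np.linalg.eigh(C)
principal = eigvec[:, -1]
```

Different normalizations (e.g. subtractive instead of multiplicative) correspond to different constraint surfaces and generally select different fixed points.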
    BibTeX:
    			
                            @article{WiskottSejnowski-1998,
                              author       = {Laurenz Wiskott and Terrence Sejnowski},
                              title        = {Constrained optimization for neural map formation: a unifying framework for weight growth and normalization},
                              journal      = {Neural Computation},
                              year         = {1998},
                              volume       = {10},
                              number       = {3},
                              pages        = {671--716},
                              url          = {http://www.mitpressjournals.org/doi/abs/10.1162/089976698300017700},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/WiskottSejnowski-1998-NeurComp-NeuralMapFormation.pdf},
                              doi          = {http://doi.org/10.1162/089976698300017700}
                            }
    			
    					
    Wiskott, L. & Sejnowski, T. 2002 Slow Feature Analysis: unsupervised learning of invariances Neural Computation , 14(4) , 715-770 .
     
    article SFA: Learning visual invariances I (1997-1999)
    Abstract: Invariant features of temporally varying signals are useful for analysis and classification. Slow feature analysis (SFA) is a new method for learning invariant or slowly varying features from a vectorial input signal. SFA is based on a non-linear expansion of the input signal and application of principal component analysis to this expanded signal and its time derivative. It is guaranteed to find the optimal solution within a family of functions directly and can learn to extract a large number of decorrelated features, which are ordered by their degree of invariance. SFA can be applied hierarchically to process high dimensional input signals and to extract complex features. Slow feature analysis is applied first to complex cell tuning properties based on simple cell output including disparity and motion. Then, more complicated input-output functions are learned by repeated application of SFA. Finally, a hierarchical network of SFA-modules is presented as a simple model of the visual system. The same unstructured network can learn translation, size, rotation, contrast, or, to a lesser degree, illumination invariance for one-dimensional objects, depending only on the training stimulus. Surprisingly, only a few training objects sufficed to achieve good generalization to new objects. The generated representation is suitable for object recognition. Performance degrades, if the network is trained to learn multiple invariances simultaneously.
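The core computation named in the abstract, PCA on a whitened signal's time derivative, can be sketched in a few lines. This is linear SFA only, omitting the non-linear expansion and the hierarchical network; the toy signal and function interface are assumptions for illustration.

```python
import numpy as np

def sfa(x, n_out=2):
    """Minimal linear Slow Feature Analysis sketch.

    x: array of shape (T, d), a multivariate time series.
    Returns the n_out slowest features, slowest first.
    """
    # 1. Center and whiten the input via PCA.
    x = x - x.mean(axis=0)
    eigval, eigvec = np.linalg.eigh(np.cov(x, rowvar=False))
    z = x @ (eigvec / np.sqrt(eigval))     # whitened: unit covariance
    # 2. PCA on the time derivative; the smallest-variance directions of the
    #    derivative are the slowest directions of the signal.
    dval, dvec = np.linalg.eigh(np.cov(np.diff(z, axis=0), rowvar=False))
    return z @ dvec[:, :n_out]             # eigh sorts ascending: slowest first

# A slow sine hidden in a faster mixture is recovered as the slowest feature.
t = np.linspace(0, 2 * np.pi, 500)
slow, fast = np.sin(t), np.sin(11 * t)
x = np.stack([slow + 0.5 * fast, 0.5 * slow - fast], axis=1)
y = sfa(x, n_out=1)
```

Up to sign and scale, the extracted feature matches the slow source; the full method applies the same step to a non-linearly expanded signal.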
    BibTeX:
    			
                            @article{WiskottSejnowski-2002,
                              author       = {Laurenz Wiskott and Terrence Sejnowski},
                              title        = {Slow {F}eature {A}nalysis: unsupervised learning of invariances},
                              journal      = {Neural Computation},
                              year         = {2002},
                              volume       = {14},
                              number       = {4},
                              pages        = {715--770},
                              url          = {http://dx.doi.org/10.1162/089976602317318938},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/WiskottSejnowski-2002-NeurComp-LearningInvariances.pdf},
                              doi          = {http://doi.org/10.1162/089976602317318938}
                            }
    			
    					
    Wiskott, L.; Sprekeler, H. & Berkes, P. 2007 Towards an analytical derivation of complex cell receptive field properties Proc. 7th Göttingen Meeting of the German Neuroscience Society, Mar 29 - Apr 1, Göttingen, Germany , S12-2 .
     
    inproceedings SFA: Complex cells (2001-2003), SFA: Theory of complex cells (2004-2007)
    BibTeX:
    			
                            @inproceedings{WiskottSprekelerEtAl-2007,
                              author       = {Laurenz Wiskott and Henning Sprekeler and Pietro Berkes},
                              title        = {Towards an analytical derivation of complex cell receptive field properties},
                              booktitle    = {Proc.\ 7th Göttingen Meeting of the German Neuroscience Society, Mar 29 -- Apr 1, Göttingen, Germany},
                              year         = {2007},
                              pages        = {S12--2}
                            }
    			
    					
    Wiskott, L.; Würtz, R. & Westphal, G. 2014 Elastic Bunch Graph Matching Scholarpedia , 9(3) , 10587 .
     
    article Face recognition with EBGM (1993,1994)
    BibTeX:
    			
                            @article{WiskottWuertzEtAl-2014,
                              author       = {Wiskott, L. and Würtz, R. and Westphal, G.},
                              title        = {Elastic {B}unch {G}raph {M}atching},
                              journal      = {Scholarpedia},
                              year         = {2014},
                              volume       = {9},
                              number       = {3},
                              pages        = {10587},
                              url          = {http://www.scholarpedia.org/article/Elastic_Bunch_Graph_Matching},
                              doi          = {http://doi.org/10.4249/scholarpedia.10587}
                            }
    			
    					
    Zeng, X.; Diekmann, N.; Wiskott, L. & Cheng, S. 2023 Modeling the function of episodic memory in spatial learning Frontiers in Psychology , 14 .
     
    article
    Abstract: Episodic memory has been studied extensively in the past few decades, but so far little is understood about how it drives future behavior. Here we propose that episodic memory can facilitate learning in two fundamentally different modes: retrieval and replay, which is the reinstatement of hippocampal activity patterns during later sleep or awake quiescence. We study their properties by comparing three learning paradigms using computational modeling based on visually-driven reinforcement learning. Firstly, episodic memories are retrieved to learn from single experiences (one-shot learning); secondly, episodic memories are replayed to facilitate learning of statistical regularities (replay learning); and, thirdly, learning occurs online as experiences arise with no access to memories of past experiences (online learning). We found that episodic memory benefits spatial learning in a broad range of conditions, but the performance difference is meaningful only when the task is sufficiently complex and the number of learning trials is limited. Furthermore, the two modes of accessing episodic memory affect spatial learning differently. One-shot learning is typically faster than replay learning, but the latter may reach a better asymptotic performance. In the end, we also investigated the benefits of sequential replay and found that replaying stochastic sequences results in faster learning as compared to random replay when the number of replays is limited. Understanding how episodic memory drives future behavior is an important step toward elucidating the nature of episodic memory.
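The contrast between online learning and replay learning can be sketched with tabular Q-learning on a toy task (a hypothetical stand-in, not the paper's visually-driven model; the chain environment and hyperparameters are assumptions): online learning updates only on the current transition, while replay learning additionally updates on transitions sampled from a memory buffer.

```python
import random

# Tiny deterministic chain MDP: states 0..4, reward on reaching state 4.
N_STATES, GOAL = 5, 4

def step(s, a):                            # a: 0 = left, 1 = right
    s2 = max(0, s - 1) if a == 0 else min(N_STATES - 1, s + 1)
    return s2, float(s2 == GOAL)

def train(use_replay, episodes=50, alpha=0.5, gamma=0.9):
    Q = [[0.0, 0.0] for _ in range(N_STATES)]
    buffer = []                            # episodic memory of transitions
    rng = random.Random(0)
    for _ in range(episodes):
        s = 0
        for _ in range(20):
            a = rng.randrange(2)           # random exploration
            s2, r = step(s, a)
            buffer.append((s, a, r, s2))
            # Replay: learn from memories; online: only the current step.
            batch = rng.choices(buffer, k=8) if use_replay else [(s, a, r, s2)]
            for (bs, ba, br, bs2) in batch:
                Q[bs][ba] += alpha * (br + gamma * max(Q[bs2]) - Q[bs][ba])
            s = s2
            if s == GOAL:                  # episode ends at the goal
                break
    return Q

Q_online = train(use_replay=False)
Q_replay = train(use_replay=True)
policy = [max((0, 1), key=lambda a: Q_replay[s][a]) for s in range(N_STATES)]
```

With replay, the greedy policy learns to move toward the goal from every state using the same number of environment steps.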
    BibTeX:
    			
                            @article{ZengDiekmannEtAl-2023,
                              author       = {Zeng, Xiangshuai and Diekmann, Nicolas and Wiskott, Laurenz and Cheng, Sen},
                              title        = {Modeling the function of episodic memory in spatial learning},
                              journal      = {Frontiers in Psychology},
                              year         = {2023},
                              volume       = {14},
                              url          = {https://www.frontiersin.org/articles/10.3389/fpsyg.2023.1160648},
                              doi          = {http://doi.org/10.3389/fpsyg.2023.1160648}
                            }
    			
    					
    Zhang, S.; Schönfeld, F.; Wiskott, L. & Manahan-Vaughan, D. 2014 Spatial representations of place cells in darkness are supported by path integration and border information Frontiers in Behavioral Neuroscience, 8, 222.
     
    article N. N.
    Abstract: Effective spatial navigation is enabled by reliable reference cues that derive from sensory information from the external environment, as well as from internal sources such as the vestibular system. The integration of information from these sources enables dead reckoning in the form of path integration. Navigation in the dark is associated with accumulating errors in the perception of allocentric position, which may reflect error accumulation in path integration. We assessed this by recording from place cells in the dark under circumstances where spatial sensory cues were suppressed. Spatial information content, spatial coherence, place field size, and peak and infield firing rates decreased, whereas sparsity increased, following exploration in the dark compared to the light. Nonetheless, it was observed that place field stability in darkness was sustained by border information in a subset of place cells. To examine the impact of encountering the environment’s border on navigation, we analyzed the trajectory and spiking data gathered during navigation in the dark. Our data suggest that although error accumulation in path integration drives place field drift in darkness, under circumstances where border contact is possible, this information is integrated to enable retention of spatial representations.
    BibTeX:
    			
                            @article{ZhangSchoenfeldEtAl-2014,
                              author       = {S. Zhang and F. Schönfeld and L. Wiskott and D. Manahan-Vaughan},
                              title        = {Spatial representations of place cells in darkness are supported by path integration and border information},
                              journal      = {Frontiers in Behavioral Neuroscience},
                              year         = {2014},
                              volume       = {8},
                              pages        = {222},
                              url          = {http://journal.frontiersin.org/article/10.3389/fnbeh.2014.00222/full},
                              doi          = {http://doi.org/10.3389/fnbeh.2014.00222}
                            }
    			
    					
    Zito, T. 2012 Exploring the slowness principle in the auditory domain PhD thesis, Institute for Biology, Humboldt University Berlin, Germany.
     
    phdthesis Independent slow feature analysis (ISFA) (2003-2005), MDP: Modular toolkit for data processing (2003-now), Extended slow feature analysis (xSFA) (2006-2013)
    Abstract: In this thesis we develop models and algorithms based on the slowness principle in the auditory domain. Several experimental results, as well as the successful results in the visual domain, indicate that, despite the different nature of the sensory signals, the slowness principle may play an important role in the auditory domain as well, if not in the cortex as a whole. Different modeling approaches have been used, which make use of several alternative representations of the auditory stimuli. We show the limitations of these approaches. In the domain of signal processing, the slowness principle and its straightforward implementation, the Slow Feature Analysis algorithm, have proven useful beyond biologically inspired modeling. A novel algorithm for nonlinear blind source separation is described that is based on a combination of the slowness and statistical independence principles, and is evaluated on artificial and real-world audio signals. In addition, the Modular toolkit for Data Processing, an open source software library, is presented.
    BibTeX:
    			
                            @phdthesis{Zito-2012,
                              author       = {Tiziano Zito},
                              title        = {Exploring the slowness principle in the auditory domain},
                              school       = {Institute for Biology, Humboldt University Berlin, Germany},
                              year         = {2012},
                              url          = {http://edoc.hu-berlin.de/docviews/abstract.php?id=39096},
                              doi          = {http://doi.org/10.18452/16450}
                            }
    			
    					
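    The thesis abstract above names Slow Feature Analysis (SFA) as the straightforward implementation of the slowness principle. As a rough illustration only (not code from the thesis, and with invented function and variable names), the linear case can be sketched in NumPy: center and whiten the signal, then take the whitened direction whose temporal derivative has the smallest variance.

```python
import numpy as np

def linear_sfa(x):
    """Return the weight vector of the slowest linear feature of x (shape: time x dims).

    Sketch of linear Slow Feature Analysis: whiten the data, then pick the
    whitened direction whose temporal-difference signal has minimal variance.
    """
    x = x - x.mean(axis=0)
    # whiten: decorrelate the dimensions and normalize their variances
    d, E = np.linalg.eigh(np.cov(x, rowvar=False))
    W = E / np.sqrt(d)              # whitening matrix (eigenvector columns rescaled)
    z = x @ W
    # covariance of the temporal differences of the whitened signal
    dd, dE = np.linalg.eigh(np.cov(np.diff(z, axis=0), rowvar=False))
    # eigh sorts eigenvalues in ascending order, so column 0 is the slowest direction
    return W @ dE[:, 0]
```

    Applied to an invertible linear mixture of a slow and a fast sine wave, the extracted feature tracks the slow source up to sign and scale.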
    Zito, T.; Wilbert, N.; Wiskott, L. & Berkes, P. 2009 Modular toolkit for Data Processing (MDP): a Python data processing framework Frontiers in Neuroinformatics, 2(8).
     
    article MDP: Modular toolkit for data processing (2003-now)
    Abstract: Modular toolkit for Data Processing (MDP) is a data processing framework written in Python. From the user's perspective, MDP is a collection of supervised and unsupervised learning algorithms and other data processing units that can be combined into data processing sequences and more complex feed-forward network architectures. Computations are performed efficiently in terms of speed and memory requirements. From the scientific developer's perspective, MDP is a modular framework, which can easily be expanded. The implementation of new algorithms is easy and intuitive. The newly implemented units are then automatically integrated with the rest of the library. MDP has been written in the context of theoretical research in neuroscience, but it has been designed to be helpful in any context where trainable data processing algorithms are used. Its simplicity on the user's side, the variety of readily available algorithms, and the reusability of the implemented units also make it a useful educational tool.
    BibTeX:
    			
                            @article{ZitoWilbertEtAl-2009,
                              author       = {T. Zito and N. Wilbert and L. Wiskott and P. Berkes},
                              title        = {Modular toolkit for {D}ata {P}rocessing ({MDP}): a {P}ython data processing framework},
                              journal      = {Frontiers in Neuroinformatics},
                              publisher    = {Frontiers Research Foundation},
                              year         = {2009},
                              volume       = {2},
                              number       = {8},
                              url          = {http://www.frontiersin.org/neuroinformatics/paper/10.3389/neuro.11/008.2008/},
                              url2         = {https://www.ini.rub.de/PEOPLE/wiskott/Reprints/ZitoWilbertEtAl-2009-FrontiersInNeuroinf.pdf},
                              doi          = {http://doi.org/10.3389/neuro.11.008.2008}
                            }
    			
    					
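    The MDP abstract above describes combining trainable processing units into feed-forward sequences. The following toy sketch illustrates that design idea in plain Python/NumPy; it is not MDP's actual API, and all class names here are invented for illustration.

```python
import numpy as np

class Node:
    """Minimal trainable processing unit, loosely in the spirit of an MDP node."""
    def train(self, x): pass
    def execute(self, x): return x

class CenterNode(Node):
    # learns the per-column mean during training, subtracts it on execution
    def train(self, x): self.mean = x.mean(axis=0)
    def execute(self, x): return x - self.mean

class ScaleNode(Node):
    # learns the per-column standard deviation, divides by it on execution
    def train(self, x): self.std = x.std(axis=0)
    def execute(self, x): return x / self.std

class Flow:
    """Chains nodes; each node is trained on the output of its predecessors."""
    def __init__(self, nodes): self.nodes = nodes
    def train(self, x):
        for node in self.nodes:
            node.train(x)
            x = node.execute(x)
    def execute(self, x):
        for node in self.nodes:
            x = node.execute(x)
        return x
```

    Training a `Flow([CenterNode(), ScaleNode()])` on a data matrix and executing it yields standardized output; the real library goes well beyond this sketch (training phases, data iterators, consistency checks, and automatic integration of new units).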
    Zito, T. & Wiskott, L. 2006 Diagonalization of time-delayed covariance matrices does not guarantee statistical independence in high-dimensional feature space Proc. ICA Research Network International Workshop, Sep 18-19, Liverpool, UK, 120-122.
     
    inproceedings Independent slow feature analysis (ISFA) (2003-2005)
    BibTeX:
    			
                            @inproceedings{ZitoWiskott-2006,
                              author       = {Tiziano Zito and Laurenz Wiskott},
                              title        = {Diagonalization of time-delayed covariance matrices does not guarantee statistical independence in high-dimensional feature space},
                              booktitle    = {Proc.\ ICA Research Network International Workshop, Sep 18-19, Liverpool, UK},
                              year         = {2006},
                              pages        = {120--122}
                            }
    			
    					

    Created by JabRef on 25/11/2023. Original JabRef Export Filter by Mark Schenk and Holger Jeromin, adapted at RUB INI.