I am a research affiliate at the Theory of Neural Systems group (Prof. Dr. Laurenz Wiskott), Institut für Neuroinformatik, Ruhr-Universität Bochum, Germany. In my current research, I explore synergies between neural networks constructed using Slow Feature Analysis (SFA) and conventional deep neural networks.

**Previous research.** As a PhD candidate, I proposed several extensions to Slow Feature Analysis (SFA) for supervised learning, such as Hierarchical Information-Preserving Graph-Based SFA (HiGSFA). These extensions perform supervised dimensionality reduction: from high-dimensional input data, they compute output features that concentrate the label-predictive information. In practice, this low-dimensional feature representation is computed by a neural network consisting of several SFA layers; on top of it, one can apply simple supervised learning algorithms to solve the corresponding classification and regression problems with good accuracy.
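To make the feature-extraction step concrete, here is a minimal linear SFA sketch in NumPy. It is a generic textbook formulation, not HiGSFA itself (which is graph-based and hierarchical), and the function name is mine: the input is centered and whitened, and the slowest features are the directions in whitened space along which the time derivative has the smallest variance.

```python
import numpy as np

def linear_sfa(x, n_features=1):
    """Minimal linear Slow Feature Analysis (illustrative sketch).

    x : array of shape (T, D), a multivariate time series.
    Returns the n_features slowest output signals (unit variance,
    ordered from slowest to fastest).
    """
    x = x - x.mean(axis=0)                     # center the data
    cov = np.cov(x, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    W = evecs / np.sqrt(evals)                 # PCA whitening matrix
    z = x @ W                                  # whitened signal, cov(z) = I
    dz = np.diff(z, axis=0)                    # discrete time derivative
    dcov = np.cov(dz, rowvar=False)
    devals, devecs = np.linalg.eigh(dcov)      # eigenvalues in ascending order
    # slowest features = smallest derivative variance in whitened space
    return z @ devecs[:, :n_features]
```

On a linear mixture of a slow and a fast source, the first output of `linear_sfa` recovers the slow source (up to sign); a supervised estimator would then be trained on such outputs.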

This work is closely related to spectral learning, unsupervised and (semi-)supervised learning, deep learning, and clustering, and is connected to algorithms such as Principal Component Analysis (PCA), Independent Component Analysis (ICA), Locality Preserving Projections (LPP), and Laplacian Eigenmaps (LE).

The approach that I have followed to develop the extensions relies on a few simple but strong principles and heuristics. The main principle behind SFA is the *slowness principle*, which is complemented in HiGSFA by *information preservation* as well as by additional heuristics, such as *hierarchical processing* (hierarchical SFA), *multiple label learning*, the use of *deep network architectures*, and *normalized nonlinear expansion functions*. Compared to previous versions of SFA, the extensions greatly improve the training efficiency, the quality of the extracted features, and the generalization to new data.
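To illustrate the nonlinear expansion step, a common choice in the SFA literature is a quadratic expansion: each input sample is augmented with all degree-2 monomials of its components, and linear SFA is then applied in the expanded space. The sketch below shows the plain (unnormalized) version; the function name is mine, and HiGSFA's normalized variants are not reproduced here.

```python
import numpy as np

def quadratic_expansion(x):
    """Augment samples with all degree-2 monomials (illustrative sketch).

    x : array of shape (T, D).
    Returns an array of shape (T, D + D*(D+1)/2): the original
    components followed by all products x_i * x_j with i <= j.
    """
    T, D = x.shape
    quads = [x[:, i] * x[:, j] for i in range(D) for j in range(i, D)]
    return np.column_stack([x] + quads)
```

Applying linear SFA after such an expansion yields quadratic SFA; stacking expansion-plus-SFA stages over local image patches gives the hierarchical networks mentioned above.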

The possible applications of the developed algorithms are vast, because they are domain independent and can be applied to many types of high-dimensional data. For the experimental evaluation, I focused on face-processing problems on images, including the estimation of age, race, and gender. The resulting label estimation accuracies range from good to competitive. Most notably, HiGSFA outperformed (late 2016 – early 2017) existing *state-of-the-art* systems for the age estimation problem on the MORPH-II database (more recent approaches, such as a five-branch VGG network and deep forests, have since achieved better accuracy). Other problems that I have addressed include face detection, digit recognition, classification of traffic signs, and unsupervised learning from the visual input of a simulated rat.

For a full description of the methods, experiments, and results, please see my dissertation (Escalante-B., A. N., 2017, Extensions of Hierarchical Slow Feature Analysis for Efficient Classification and Regression on High-Dimensional Data).