## DataNinja Spring School

The group recently attended the DataNinja Spring School, held online from the 23rd to the 25th of March. The Spring School was organized by the University of Bielefeld and focused on “Artificial Intelligence – perspectives and challenges”. Maribel Acosta gave a talk on “Symbolic and Sub-symbolic Representations of Knowledge Graphs”.

## Tutorial at the KnowGraphs Winter School 2022

Maribel Acosta presented a tutorial on "Querying Federations of Knowledge Graphs" at the KnowGraphs Winter School 2022. The Marie Curie ITN-ETN KnowGraphs is a training network that focuses on research about Knowledge Graphs. The Winter School 2022 was held online from 31 January to 2 February 2022 and comprised 9 tutorials and 3 data challenges.

## A brief introduction to Slow Feature Analysis

One of the main research topics of the TNS group is Slow Feature Analysis. Slow feature analysis (SFA) is an unsupervised learning method that extracts the slowest or smoothest underlying functions or features from a time series. It can be used for dimensionality reduction, regression, and classification. In this post, we first provide a code example applying SFA to help motivate the method, then go into more detail about the mathematics behind it, and finally link to other good resources on the material.
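As a small taste of the method, here is a minimal sketch of *linear* SFA in NumPy (the function name, toy signals, and mixing coefficients are our own choices for illustration, not code from the post): the data are centered and whitened, and the slowest features are the whitened directions whose temporal derivative has the smallest variance.

```python
import numpy as np

def linear_sfa(x, n_features=1):
    """Minimal linear SFA sketch: whiten the input, then return the
    projections whose temporal derivatives have the smallest variance."""
    x = x - x.mean(axis=0)                          # center
    val, vec = np.linalg.eigh(np.cov(x, rowvar=False))
    z = x @ (vec / np.sqrt(val))                    # whiten (unit covariance)
    zdot = np.diff(z, axis=0)                       # finite-difference derivative
    dval, dvec = np.linalg.eigh(np.cov(zdot, rowvar=False))
    return z @ dvec[:, :n_features]                 # slowest directions first

# A slow and a fast sine wave, linearly mixed into two channels
t = np.linspace(0, 2 * np.pi, 1000)
slow, fast = np.sin(t), np.sin(20 * t)
x = np.column_stack([slow + 0.5 * fast, 0.5 * slow - fast])

y = linear_sfa(x, n_features=1)
# The slowest extracted feature matches the slow source up to sign,
# so the absolute correlation below should be close to 1
print(abs(np.corrcoef(y[:, 0], slow)[0, 1]))
```

Note that `np.linalg.eigh` returns eigenvalues in ascending order, which is why the first columns of `dvec` correspond to the slowest directions.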

## An extension to Slow Feature Analysis (xSFA)

Following our previous tutorial on Slow Feature Analysis (SFA), we now turn to xSFA, an unsupervised extension of the original algorithm that uses the slow features extracted by SFA to reconstruct the individual sources of a nonlinear mixture. This task is known as Blind Source Separation (e.g., reconstructing individual voices from a recording of a conversation between multiple people). In this tutorial, we provide a short example demonstrating the capabilities of xSFA, discuss its limits, and offer some pointers on how and when to apply it. We also take a closer look at the theoretical background of xSFA to provide an intuition for the mathematics behind it.
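To preview the core idea, here is a hedged toy sketch (the helper names, mixture, and coefficients are invented for illustration; this is the nonlinear-SFA building block that xSFA iterates on, not the full algorithm): applying SFA to a quadratic expansion of a mildly nonlinear mixture already recovers the slow source.

```python
import numpy as np

def sfa(x, n_features=1):
    """Linear SFA: whiten, then find directions of minimal temporal variation."""
    x = x - x.mean(axis=0)
    val, vec = np.linalg.eigh(np.cov(x, rowvar=False))
    z = x @ (vec / np.sqrt(val))                    # whitened signals
    dval, dvec = np.linalg.eigh(np.cov(np.diff(z, axis=0), rowvar=False))
    return z @ dvec[:, :n_features]                 # slowest projections

def quadratic_expand(x):
    """Append all degree-2 monomials of the input channels."""
    d = x.shape[1]
    quads = [x[:, i] * x[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack([x] + quads)

# Toy blind-source-separation setup: two sources, mildly nonlinear mixture
t = np.linspace(0, 2 * np.pi, 2000)
s_slow, s_fast = np.sin(t), np.sin(17 * t)
m1 = s_slow + 0.2 * s_fast                          # linear mixing
m2 = s_fast + 0.2 * s_slow
x = np.column_stack([m1 + 0.2 * m1**2,              # channel-wise nonlinearity
                     m2 + 0.2 * m2**2])

# SFA in the quadratically expanded space can undo the mild nonlinearity,
# so the slowest feature correlates strongly with the slow source (up to sign)
y = sfa(quadratic_expand(x), n_features=1)
print(abs(np.corrcoef(y[:, 0], s_slow)[0, 1]))
```

Full xSFA would additionally remove the reconstructed source from the data and repeat the procedure to recover the remaining sources one by one.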

## Modeling the hippocampus, part I: Why the hippocampus?

In this multi-part series I'd like to give an introduction to how computational neuroscience can work hand in hand with experimental neuroscience to help us understand how the mammalian brain works. As a case study we'll take a look at modeling the hippocampus, a central and essential structure in our daily dealings with reality. In part I of the series we first take a look at the hippocampus, its role in the brain, and what makes this particular structure so uniquely fascinating.

## Modeling the hippocampus, part II: Hippocampal function.

In this multi-part series I'd like to give an introduction to how computational neuroscience can work hand in hand with experimental neuroscience. In part II of this series we take a look at some of the fundamental problems of understanding brain computations. To get an idea of hippocampal function, we also discuss its involvement in human memory and how we came to know about it.

## Modeling the hippocampus, part III: Spatial processing in the hippocampus.

In this multi-part series I'd like to give an introduction to how computational neuroscience can work hand in hand with experimental neuroscience to understand the mammalian hippocampus. In this third part of the series we take a look at the role of the hippocampus in spatial processing in rodents, to get a better idea of the computation the hippocampus performs for our brains.

## Ratlab: a toolkit for studying spatially selective neurons with Slow Feature Analysis

Here I give a (brief) overview of SFA, before introducing the Ratlab toolkit and its main modules, and finally showing some of the different results that you can get with this software. I hope that you enjoy learning about this fun and versatile toolkit!

## About the INI

The Institut für Neuroinformatik (INI) is a central research unit of the Ruhr-Universität Bochum. We aim to understand the fundamental principles through which organisms generate behavior and cognition while linked to their environments through sensory systems and while acting in those environments through effector systems. Inspired by our insights into such natural cognitive systems, we seek new solutions to problems of information processing in artificial cognitive systems. We draw from a variety of disciplines that include experimental approaches from psychology and neurophysiology as well as theoretical approaches from physics, mathematics, electrical engineering and applied computer science, in particular machine learning, artificial intelligence, and computer vision.

## Contact

Universitätsstr. 150, Building NB, Room 3/32
D-44801 Bochum, Germany

Tel: (+49) 234 32-28967
Fax: (+49) 234 32-14210