Computational Neuroscience: Neural Dynamics
This course will be taught in an inverted classroom format, combining in-person and online participation.
For each lecture, a video will be made available that students should watch BEFORE the lecture hour. During the lecture hour, we discuss the material presented in the video. To profit from that discussion, you MUST have watched the video beforehand. The discussion takes place in the classroom, but can also be followed in real time through a Zoom channel. Both students in the classroom and students following online can and should ask questions.
The same format and the same Zoom channel will be used for the exercise sessions, in which the solutions to the corrected exercises are discussed. The exercise sessions can also be used to ask general questions.
The course uses e-learning features provided through these webpages (see under "E-LEARNING"). This course is NOT managed through Moodle! Once registered, the e-learning webpages at ini.rub.de give you access to the video lectures, the Zoom room, the lecture slides, readings, exercise sheets, and more. You will upload your exercise solutions there and will see the marked corrections to your solutions. You can also ask questions in the "discussion forum".
To take the course, you must therefore register through this webpage: go to "e-learning", select this course, and follow the instructions there. You will need an email address of the Ruhr-University Bochum or the Technical University Dortmund for registration. If you are an exchange student without such an email address, or come from another university within the Ruhr-Alliance, contact us by email as instructed there. When registering, please fill in your degree program (for example, "MSc Angewandte Informatik", not just "Master of Science"). We need this information to manage exams and credit points.
This course lays the foundations for a neurally grounded understanding of the fundamental processes in perception, cognition, and motor control that enable intelligent action in the world. The theoretical perspective is aligned with ideas from embodied and situated cognition, but embraces concepts of neural representation and aims to reach higher cognition. Neural grounding is provided at the level of populations of neurons in the brain that form strongly recurrent neural networks and are ultimately linked to the sensory and motor surfaces.
The theoretical concepts on which the course is based come from dynamical systems theory. These concepts are used to characterize neural processes in strongly recurrent neural networks as neural dynamic systems, in which stable activation states emerge from the connectivity patterns within neural populations. These connectivity patterns imply that neural populations represent low-dimensional feature spaces. This leads to neural dynamic fields of activation as the building blocks of neural cognitive architectures. Dynamic instabilities induce changes of attractor states, from which cognitive functions emerge, such as detection, change, and selection decisions, working memory, and sequences of processing stages.
The course partially follows a textbook (Dynamic Thinking: A Primer on Dynamic Field Theory, Schöner, Spencer, and the DFT Research Group, Oxford University Press, 2016), chapters of which will serve as reading material. Exercises will focus on hands-on simulation experiments, but also involve readings and the writing of short essays on interdisciplinary research topics. See www.dynamicfieldtheory.org for some of that material. Tutorials on mathematical concepts are provided, so that training in calculus and differential equations is useful, but not a prerequisite for the course.
Prof. Dr. Gregor Schöner, Lecturer
|(+49) 234-32-27965 firstname.lastname@example.org NB 3/31|
Daniel Sabinasz, M.Sc., Teaching Assistant (primary contact)
|(+49) 234-32-27973 email@example.com NB 02/74|
Raul Grieben, M.Sc., Tutor
|(+49) 234-32-27973 firstname.lastname@example.org NB 02/74|
Rebecca Baldi, M.Sc., Tutor
|(+49) 234-32-24201 email@example.com NB 02/73|
|(+49) 234-32-27971 firstname.lastname@example.org NB 02/76|
Stephan Sehring, M.Sc., Tutor
|(+49) 234-32-27976 email@example.com NB 02/75|
- Course type
- 6 CP
- Winter Term 2022/2023
- e-learning course available
Every week on Thursday from 14:15 to 16:00 in room NB 3/57.
First appointment is on 13.10.2022
Last appointment is on 02.02.2023
Every week on Thursday from 16:00 to 16:45 in room NB 3/57.
First appointment is on 20.10.2022
Last appointment is on 02.02.2023
This course requires some basic math preparation, typically as covered in two semesters of higher mathematics (functions, differentiation, integration, differential equations, linear algebra). The course does not make extensive use of the underlying mathematical techniques, but uses the mathematical concepts to express scientific ideas. Students without prior training in the relevant mathematics may be able to follow the course, but will have to work harder to familiarize themselves with the concepts.
Exercises are organized by Daniel Sabinasz. Details on grading are available in the course rules below.
The course will be based on selected chapters of a textbook (Dynamic Thinking: A Primer on Dynamic Field Theory by Schöner, G., Spencer, J., and the DFT Research Group, Oxford University Press). The Introduction and the first two chapters are available for download in the course materials below. These and other chapters will also serve as readings for some of the exercises.
For the mathematical background in dynamical systems, an excellent resource is a book that is available online as a free download (thanks to the author's generosity): Edward R. Scheinerman's Invitation to Dynamical Systems. The book covers both discrete-time and continuous-time dynamical systems; in the course, we will only make use of continuous-time dynamical systems formalized as differential equations.
Organization of the course
This will be discussed in the first live session...
|Document||Rules for credit|
Watch this for an introduction to the topic and an overview of the course.
Dynamical systems tutorial
|Lecture slides||Dynamical systems tutorial part 1|
Dynamical systems tutorial part 1
This lecture, given by Sophie Aerdker, gives a brief introduction to foundational concepts from the mathematics of dynamical systems, in preparation for the neural dynamics of Dynamic Field Theory covered in the rest of the course.
Dynamical systems tutorial part 2
The second part of the dynamical systems tutorial, presented by Sophie Aerdker as background for the Neural Dynamics course. It covers bifurcations and their significance for modeling.
|Video||Dynamical systems tutorial part 2|
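Bifurcations like those treated in the tutorial can also be probed numerically. The following sketch (in Python rather than the Matlab used in the course; the parameter values are illustrative) Euler-integrates the normal form of the tangent (saddle-node) bifurcation, dx/dt = mu - x^2, whose two fixed points collide and vanish as mu crosses zero:

```python
def attractor(mu, x0=2.0, dt=0.01, steps=5000):
    """Euler-integrate dx/dt = mu - x**2, the normal form of the
    tangent (saddle-node) bifurcation, and return the final state."""
    x = x0
    for _ in range(steps):
        x += dt * (mu - x * x)   # forward-Euler step
        if x < -100:             # no fixed point left: trajectory escapes
            return None
    return x
```

For mu > 0, the trajectory relaxes to the stable fixed point x* = sqrt(mu); for mu < 0, no fixed point remains and the trajectory escapes, which is the qualitative change of behavior the tutorial discusses.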
This Matlab code illustrates the ideas of numerical simulation of differential equations... As a RUB student you have access to Matlab here.
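The core idea of such a numerical simulation is the forward-Euler scheme: a differential equation tau * dx/dt = f(x) is advanced in small time steps dt via x <- x + (dt/tau) * f(x). A minimal Python sketch of the same idea (the function name and parameter values are made up for illustration):

```python
def euler_relaxation(x0=0.0, s=2.0, tau=0.1, dt=0.01, steps=200):
    """Forward-Euler integration of tau * dx/dt = -x + s,
    a simple relaxation dynamics with a single attractor at x* = s."""
    x = x0
    trajectory = [x]
    for _ in range(steps):
        x = x + (dt / tau) * (-x + s)   # Euler step
        trajectory.append(x)
    return trajectory

traj = euler_relaxation()
# the state relaxes monotonically from x0 toward the attractor x* = s
```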
Note: this is the most mathematical of all the exercise sheets. For mathematically skilled participants, it should not be hard, and hopefully insightful. For those who have not practiced math for a while, it will be hard. But the remainder of the course and of the exercises will NOT be in this style, so do not despair if you struggle with this exercise sheet.
|Lecture slides||Embodied nervous systems: Braitenberg vehicles|
Embodied nervous systems: Braitenberg vehicles
This lecture is part of the introductory portion of the course in neural dynamics. It uses the metaphor proposed by Valentino Braitenberg of organisms as vehicles, with sensors, motors, a body that connects these mechanically, and a nervous system that connects these neurally, embedded in a structured environment. We see how behavior emerges from intuitive mental simulation and how this can be made exact based on models of the environment and of the vehicle. This leads to the notion of behavioral attractor dynamics. I discuss the relation to cybernetic thinking and point to how neural dynamics goes beyond this simple case.
|Exercises||Exercise 2: Braitenberg vehicles|
|Lecture slides||Neural dynamics|
This is the first lecture in the course that introduces neural dynamics properly speaking. Motivated by the dynamics of the membrane potential of neurons, the basic equation is introduced and illustrated. The simplest recurrent network, a single neuron coupled excitatorily to itself, is used to introduce the detection instability. Two neurons that are inhibitorily coupled exemplify competitive selection.
|Exercises||Exercise 4 on neural dynamics|
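The detection instability for a single self-excitatory neuron can be reproduced in a few lines. This sketch (parameter values are illustrative choices, not reference values from the lecture) Euler-integrates tau * du/dt = -u + h + s + c * sigma(u):

```python
import math

def sigmoid(u, beta=4.0):
    """Sigmoidal output nonlinearity of the neuron."""
    return 1.0 / (1.0 + math.exp(-beta * u))

def simulate_neuron(s, h=-5.0, c=6.0, tau=1.0, dt=0.05, steps=1000):
    """Euler-integrate tau * du/dt = -u + h + s + c * sigmoid(u)
    for one neuron coupled excitatorily to itself, from rest u = h."""
    u = h
    for _ in range(steps):
        u += (dt / tau) * (-u + h + s + c * sigmoid(u))
    return u

u_weak = simulate_neuron(s=2.0)    # settles sub-threshold, near h + s
u_strong = simulate_neuron(s=6.0)  # detection: switches to the activated state
```

With weak input, the neuron settles near the sub-threshold state u = h + s; a stronger input destabilizes that state, and the activation switches to the self-sustained, supra-threshold state.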
Foundations of Dynamic Field Theory (DFT)
|Lecture slides||DFT: Foundations and detection|
DFT: Foundations and detection
This is the core lecture on Dynamic Field Theory for the Neural Dynamics course. It introduces the notion of a neural dynamic field, making sense of the dimensions over which such fields are defined, and proceeds to discuss the basic attractor states and their instabilities. The detection instability is discussed in some depth and linked to psychophysical evidence.
|Exercises||Exercise 5: detection|
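A neural dynamic field over one dimension can be simulated along the same lines. The sketch below (all parameters are illustrative choices, not reference values from the lecture) Euler-integrates an Amari-type field equation with a local excitatory interaction kernel and global inhibition, and shows the detection instability: a weak localized input leaves the field sub-threshold, while a stronger input at the same site induces a self-stabilized peak.

```python
import numpy as np

def simulate_field(amp, steps=400, dt=0.05):
    """Euler-integrate a 1-D neural dynamic field
    du/dt = -u + h + s(x) + conv(w_exc, f(u)) - g_inh * sum(f(u)),
    with sigmoidal output f(u); the time constant tau = 1 is absorbed into dt."""
    x = np.arange(101.0)                                  # field dimension
    h = -5.0                                              # resting level
    s = amp * np.exp(-(x - 50.0) ** 2 / (2 * 4.0 ** 2))   # localized input at x = 50
    kd = np.arange(-15.0, 16.0)
    w_exc = 4.0 * np.exp(-kd ** 2 / (2 * 3.0 ** 2))       # local excitatory kernel
    g_inh = 0.5                                           # global inhibition strength
    u = np.full_like(x, h)                                # start at resting level
    for _ in range(steps):
        f = 1.0 / (1.0 + np.exp(-4.0 * u))                # sigmoid of activation
        exc = np.convolve(f, w_exc, mode="same")          # local excitation
        u += dt * (-u + h + s + exc - g_inh * f.sum())
    return u

u_off = simulate_field(amp=2.0)   # weak input: field stays sub-threshold
u_on = simulate_field(amp=6.0)    # strong input: a self-stabilized peak forms
```

The peak stays localized through the interplay of local excitation, which stabilizes it, and global inhibition, which prevents it from spreading.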
|Lecture slides||DFT: Selection|
This is the second core lecture on Dynamic Field Theory for the Neural Dynamics course. It focuses on selection decisions. I first review how such decisions are made in DFT and what functional properties emerge from that mechanism. Then I discuss the limited evidence we have about "free" choice decisions by reviewing work on saccadic eye movements. The reaction time paradigm is then discussed in light of DFT accounts. Finally, selection decisions in the timed-movement-initiation paradigm are presented as a major source of empirical support for the DFT account of selection.
|Exercises||Exercise 6: Selection|
|Document||Reading for Exercise 6: movement preparation|
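The selection mechanism can be illustrated with the smallest possible example: two mutually inhibitory neural activation variables. This is a toy sketch with made-up parameter values, not the lecture's reference implementation:

```python
import math

def sigmoid(u, beta=4.0):
    """Sigmoidal output nonlinearity."""
    return 1.0 / (1.0 + math.exp(-beta * u))

def compete(s1, s2, h=-2.0, c=6.0, dt=0.05, steps=2000):
    """Euler-integrate two mutually inhibitory neurons:
    du1/dt = -u1 + h + s1 - c * sigmoid(u2), and symmetrically for u2."""
    u1 = u2 = h
    for _ in range(steps):
        du1 = -u1 + h + s1 - c * sigmoid(u2)
        du2 = -u2 + h + s2 - c * sigmoid(u1)
        u1 += dt * du1
        u2 += dt * du2
    return u1, u2

u1, u2 = compete(s1=6.0, s2=5.0)
# the neuron receiving the stronger input wins; the other is suppressed
```

Even a small difference in input strength is amplified into an all-or-nothing outcome: the winner reaches a supra-threshold state, while the loser is pushed well below threshold.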
This lecture reviews both the memory trace and working memory as two further foundational elements of Dynamic Field Theory. I also briefly introduce the three-layer field model of change detection and show how it accounts for signatures of visual working memory. Piaget's A-not-B paradigm is used to illustrate all these ideas.
|Lecture slides||DFT: Memory|
|Exercises||Exercise 7: Memory|
|Lecture slides||DFT embodied|
This short lecture illustrates how neural dynamic fields can be directly driven from time varying sensory inputs and can conversely drive motor behaviors in closed loop.
|Lecture slides||DFT Neural basis|
DFT neural basis
A short lecture about how neural dynamic fields are linked to distributions of neural population activation.
Higher dimensional fields
Higher dimensional fields: Binding, search, and coordinate transforms
This lecture explores neural dynamic fields in higher dimensions. When these dimensions combine different features, the fields represent bound objects. We show how this enables new cognitive functions, most prominently search, exemplified by visual search. The scaling problem with an increasing number of dimensions is addressed, and localist representations are contrasted with distributed representations. Binding through space, as in Feature Integration Theory, provides a solution for the localist approach of DFT. Finally, I show how coordinate transforms are enabled by bound representations and how they are a possible reason for the attentional bottleneck in visual (and other) cognition.
|Lecture slides||Higher dimensional fields: binding, search, coordinate transforms|
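The idea that a coordinate transform can be read out from a bound two-dimensional representation can be sketched in a few lines. This toy illustration replaces dynamic fields by sharp one-hot "peaks", and all sizes and positions are made up; it only shows the geometry of the read-out, summing the 2-D representation along its diagonals:

```python
import numpy as np

n = 51
retinal = np.zeros(n)
retinal[30] = 1.0          # peak: stimulus at retinal position 30
gaze = np.zeros(n)
gaze[10] = 1.0             # peak: gaze direction 10

# bind the two 1-D representations into a 2-D representation (outer product)
field2d = np.outer(retinal, gaze)

# read out the body-centered position by summing along diagonals r + g = const
idx = np.add.outer(np.arange(n), np.arange(n)).ravel()
body = np.bincount(idx, weights=field2d.ravel(), minlength=2 * n - 1)
# the body-centered peak sits at retinal + gaze = 40
```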
What is DFT?
|Exercises||Essay exercise 8: What is DFT?|
|Lecture slides||Sequence generation|
This is an edited version of a video from the 2022 DFT summer school that provides the core ideas around the autonomous generation of sequences of mental or motor states.
Toward higher cognition
|Video||DFT models of grounded cognition|
|Lecture slides||DFT models of grounded cognition|
DFT models of compositionality (scibo)
In this lecture, Daniel Sabinasz reviews the famous notion of productivity, namely, the ability to flexibly join "atomic" linguistic units into "molecular" linguistic units, and to join molecular linguistic units into more complex molecular linguistic units. He further reviews the notion of compositionality, which accounts for how we understand molecular expressions by virtue of understanding the meanings of their parts and the way the parts are combined. This leads to a discussion of how compositionality may be achieved by neural systems in a way that is consistent with the principles formalized in Dynamic Field Theory.
DFT models of compositionality (youtube)
|Lecture slides||DFT models of compositionality|
Here are some chapters from the book Dynamic Thinking: A Primer on Dynamic Field Theory (Schöner, Spencer, and the DFT Research Group, Oxford University Press, 2016), which may serve as background reading for the course.
Project: Implementing a simple Visual Search architecture with CEDAR
This tutorial will walk you through the steps of implementing a simple visual search architecture with CEDAR. The exercise is optional and gives you practice in building cognitive architectures.
Template for the visual search project
|Reference solution||Solution to visual search exercise|
The Institut für Neuroinformatik (INI) is a central research unit of the Ruhr-Universität Bochum. We aim to understand the fundamental principles through which organisms generate behavior and cognition while linked to their environments through sensory systems and while acting in those environments through effector systems. Inspired by our insights into such natural cognitive systems, we seek new solutions to problems of information processing in artificial cognitive systems. We draw from a variety of disciplines that include experimental approaches from psychology and neurophysiology as well as theoretical approaches from physics, mathematics, electrical engineering and applied computer science, in particular machine learning, artificial intelligence, and computer vision.
Universitätsstr. 150, Building NB, Room 3/32
D-44801 Bochum, Germany
Tel: (+49) 234 32-28967
Fax: (+49) 234 32-14210