
Topics in Deep Learning for Sequence Processing

Limited number of participants! Please register via the Moodle course "Masters Seminar: Topics in deep learning for sequence processing".

Natural language processing, robotics, video processing, stock market forecasting and similar tasks require models that can handle sequence data and capture temporal dependencies. Two major classes of models designed for sequence data are recurrent neural networks (RNNs/LSTMs) and transformer architectures. Designing and understanding these models is a very active and diverse area of research, and their applications are widespread. The recent explosion of interest in topics such as language modelling and machine translation is based on advances in these models, which include GPT-3, DALL-E, and others.

In this course you will first learn the fundamentals of recurrent neural networks and transformers that led to these breakthroughs. We will then read and discuss both seminal and recent research papers on these topics to shed light on the algorithms and challenges in this field.

The allocation of the limited seminar places is done via the Moodle course "Masters Seminar: Topics in deep learning for sequence processing" for the corresponding semester. Registration for the winter semester is possible until 11.10.2021. Please enter your degree program in the comment field. Places are allocated with priority to Master's students of Applied Computer Science, but the course is open to students of other faculties who meet the content requirements.

Lecturers

Details

Course type
Seminars
Credits
3
Term
Winter Term 2021/2022
E-Learning
Moodle course available

Dates

Seminar
Takes place every week on Wednesday from 14:00 to 16:00.
First session is on 13.10.2021.
Last session is on 02.02.2022.

Requirements

We expect a solid command of mathematics as taught in the Applied Computer Science Bachelor's programme. The mathematical tools commonly used in machine learning are

  • basic probability theory/statistics (expectations, variance, foundational distributions and densities, Markov chains)
  • linear algebra (matrices, vectors, eigenvalues/eigenvectors)
  • calculus (functions, derivatives/gradients, simple integrals)

The course material is in English, and the course language will be English.

Taking the course on Artificial Neural Networks either before or in parallel to this course is recommended (but not required).

The Institut für Neuroinformatik (INI) is a central research unit of the Ruhr-Universität Bochum. We aim to understand the fundamental principles through which organisms generate behavior and cognition while linked to their environments through sensory systems and while acting in those environments through effector systems. Inspired by our insights into such natural cognitive systems, we seek new solutions to problems of information processing in artificial cognitive systems. We draw from a variety of disciplines that include experimental approaches from psychology and neurophysiology as well as theoretical approaches from physics, mathematics, electrical engineering and applied computer science, in particular machine learning, artificial intelligence, and computer vision.

Universitätsstr. 150, Building NB, Room 3/32
D-44801 Bochum, Germany

Tel: (+49) 234 32-28967
Fax: (+49) 234 32-14210