Numerical Optimization
Content:
This course offers a contemporary introduction to numerical optimization. Optimization algorithms find applications in many areas, including engineering, economics, and machine learning. This course covers the most prominent design principles and algorithm classes:
· gradient and Newton search directions
· line search and trust region methods
· conjugate gradients
· quasi-Newton algorithms
· constraint handling, duality
· linear programming, including the mixed-integer case
· direct search (gradient-free) methods
Methods are presented and analyzed in the lecture, and implemented and tested in the exercise sessions.
During the practical sessions, participants work on a mix of conceptual and practical exercises. Many practical exercises involve programming in Python.
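As an illustrative sketch of the kind of method treated in the lecture and exercises, the following minimal Python example implements steepest descent with a backtracking (Armijo) line search. It is not official course material; the function and parameter names are chosen for illustration only.

```python
import numpy as np

def backtracking_line_search(f, grad, x, d, alpha=1.0, rho=0.5, c=1e-4):
    """Shrink the step size until the Armijo (sufficient decrease) condition holds."""
    g = grad(x)
    while f(x + alpha * d) > f(x) + c * alpha * g @ d:
        alpha *= rho
    return alpha

def gradient_descent(f, grad, x0, tol=1e-8, max_iter=1000):
    """Steepest descent with backtracking line search; stops when the gradient is small."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g  # steepest descent direction
        alpha = backtracking_line_search(f, grad, x, d)
        x = x + alpha * d
    return x

# Minimize a simple convex quadratic f(x) = x^T A x with its minimum at the origin.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
f = lambda x: x @ A @ x
grad = lambda x: 2.0 * A @ x
x_star = gradient_descent(f, grad, [5.0, -3.0])
```

On a well-conditioned quadratic like this one, steepest descent converges linearly; the course analyzes how the convergence rate depends on the conditioning of the problem.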
Learning Outcomes:
The participants can explain important classes of optimization algorithms, such as conjugate gradients (CG), BFGS, CMA-ES, and the simplex algorithm for linear programming. They understand the properties of different types of search directions (sampling methods, gradient, conjugate gradient, Newton step, and quasi-Newton methods), as well as step size control mechanisms (line search and trust region methods). They can relate these methods to convergence speed classes such as linear and super-linear convergence. They understand Lagrangians and can derive dual optimization problems. They can model real-world problems as mathematical optimization problems, pick suitable algorithms, apply them to actual optimization problems, and solve them efficiently.
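To illustrate the Newton step and its fast local convergence on smooth problems, here is a minimal, hypothetical Python sketch; the strictly convex test function and all names are invented for illustration and are not part of the course.

```python
import numpy as np

def newton(grad, hess, x0, tol=1e-12, max_iter=50):
    """Newton's method: obtain the search direction by solving H(x) d = -g(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(hess(x), -g)
        x = x + d  # full Newton step; practical solvers add a line search or trust region
    return x

# Strictly convex test function f(x) = x1^2 + x2^2 + exp(x1),
# chosen so that the Hessian is positive definite everywhere.
grad = lambda x: np.array([2 * x[0] + np.exp(x[0]), 2 * x[1]])
hess = lambda x: np.array([[2 + np.exp(x[0]), 0.0], [0.0, 2.0]])

x_star = newton(grad, hess, [0.0, 1.0])
```

Near the minimizer, the iterates converge quadratically, in contrast to the linear convergence of plain gradient descent; this distinction between convergence speed classes is a central theme of the course.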
Recommended Prior Knowledge:
This course requires a solid grasp of the mathematical concepts used to define and analyze optimization algorithms, including a few proofs. Participants need a solid understanding of linear algebra (matrices, eigendecomposition, positive definiteness), analysis (sequences, convergence, convexity, gradients and Hessian matrices), and probability theory (the multivariate normal distribution). Prior knowledge of numerics is a plus, but not required. The exercise sessions as well as the exam require basic Python programming.
Exam:
Written electronic exam (90 minutes)
Lecturers
Prof. Dr. Tobias Glasmachers (Lecturer)
(+49) 234-32-25558, tobias.glasmachers@ini.rub.de, Room NB 3/27
Details
- Course type
- Lectures
- Credits
- 6
- Term
- Winter Term 2025/2026
Dates
- Lecture
- Takes place every week on Thursday from 10:15 to 13:15.
- First appointment: 16.10.2025
- Last appointment: 05.02.2026
Requirements
This course has no formal requirements. The target audience includes Master's students of all technical subjects, such as computer science, mathematics, physics, and engineering.
Literature:
1. “Numerical Optimization”, Nocedal and Wright
2. “Introduction to derivative-free optimization”, Conn, Scheinberg and Vicente
3. “The CMA evolution strategy: A tutorial”, Hansen
The Institut für Neuroinformatik (INI) is a research unit of the Faculty of Computer Science at the Ruhr-Universität Bochum. Its scientific goal is to understand the fundamental principles through which organisms generate behavior and cognition while linked to their environments through sensory and effector systems. Inspired by our insights into such natural cognitive systems, we seek new solutions to problems of information processing in artificial cognitive systems. We draw from a variety of disciplines that include experimental psychology and neurophysiology as well as machine learning, neural artificial intelligence, computer vision, and robotics.