Welcome to the Homepage of Tobias Glasmachers

Table of Contents

Contact
Short CV
Teaching
Research
Projects
Open Positions
Software
Publications
Homepage of Tobias Glasmachers - Contact

Address

Building NB, Room 3/27
Institut für Neuroinformatik
Ruhr-Universität Bochum
Universitätsstr. 150
44780 Bochum, Germany

Telephone

+49-(0)234-32-25558

E-mail

Homepage of Tobias Glasmachers - Short CV

Homepage of Tobias Glasmachers - Teaching

Lectures

Summer term 2016:
Winter term 2015/2016:

Theses

I offer Master's theses in the areas of machine learning and optimization.

Prerequisites:

Please contact me for details and for currently open topics.

Previous Lectures and Seminars

Summer term 2015 "Machine Learning – Supervised Methods"
Winter term 2014/15 "Machine Learning – Evolutionary Algorithms"
Summer term 2014 "Machine Learning – Supervised Methods"
Summer term 2014 Seminar "Dimensionality Reduction and Manifold Learning"
Winter term 2013/14 "Evolutionary Algorithms"
Winter term 2012/13 "Evolutionary Algorithms"
Summer term 2012 "Machine Learning – Supervised Methods"
Homepage of Tobias Glasmachers - Research

My research lies in the area of machine learning, a modern branch of artificial intelligence. Machine learning is an interdisciplinary field at the intersection of computer science, statistics, and optimization, with connections to the neurosciences and applications in robotics, engineering, medicine, economics, and many other disciplines. Within this wide area I focus on two aspects: supervised learning, mostly with support vector machines, and optimization with gradient methods and evolutionary algorithms.

Supervised Learning

Supervised learning is a learning paradigm with countless (mostly technical) applications. A learning machine (an algorithm) builds a predictive model from data provided in the form of input/output pairs; classification and regression problems are the primary examples. Support vector machines (SVMs) have become a standard method in the field. On the one hand, I am interested in the SVM training problem, which essentially amounts to large-scale quadratic programming. On the other hand, I try to simplify SVM usage for non-experts by developing robust methods for automatic model selection. My research activities cover both theoretical and practical aspects, ranging from SVM optimization to experimental comparison studies and software development.
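
As an illustration of what automatic model selection replaces, here is a minimal sketch of cross-validated grid search over SVM hyperparameters. It uses scikit-learn with arbitrary parameter ranges purely for illustration; it does not reproduce the methods developed in this research.

```python
# Minimal sketch: cross-validated grid search over SVM hyperparameters.
# scikit-learn and the parameter ranges are illustrative choices only.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

param_grid = {
    "C": [0.1, 1.0, 10.0, 100.0],  # regularization strength
    "gamma": [0.01, 0.1, 1.0],     # RBF kernel width
}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```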

Optimization

Gradient-based optimization methods, particularly relatively simple first-order methods such as (stochastic) gradient descent and coordinate descent, are at the heart of many modern training procedures for learning machines, in particular for (possibly regularized) empirical risk minimization.
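
As a concrete example, the following sketch applies stochastic gradient descent to L2-regularized logistic regression, a typical instance of regularized empirical risk minimization. All constants (problem size, regularization strength, step sizes) are arbitrary.

```python
# Minimal sketch: SGD on L2-regularized logistic loss (regularized ERM).
# Problem size, regularization strength, and step sizes are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
n, d, lam = 1000, 20, 1e-3
X = rng.normal(size=(n, d))
y = np.sign(X @ rng.normal(size=d) + 0.1 * rng.normal(size=n))  # labels in {-1, +1}

w = np.zeros(d)
for t in range(1, 20001):
    i = rng.integers(n)                            # draw one training example
    margin = y[i] * (X[i] @ w)
    grad = -y[i] * X[i] / (1.0 + np.exp(margin))   # gradient of log(1 + exp(-margin))
    grad += lam * w                                # gradient of the L2 regularizer
    w -= (0.1 / np.sqrt(t)) * grad                 # decaying step size
```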

Evolutionary algorithms (EAs) are a class of nature-inspired algorithms that mimic the process of Darwinian evolution, decomposed into the components of inheritance, variation, and selection. EAs are widely recognized as useful tools for search and optimization; formally, they can be understood as randomized direct search heuristics, which makes them suitable for black-box optimization problems. I focus on evolution strategies, a class of optimization algorithms for continuous variables, and on multi-objective optimization.
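
For illustration, here is a minimal (1+1) evolution strategy with the classical 1/5 success rule for step size adaptation, minimizing a simple quadratic. It shows the variation/selection loop in its simplest form; it is not one of the specific algorithms developed in my research.

```python
# Minimal sketch: a (1+1) evolution strategy with the 1/5 success rule,
# minimizing the sphere function; objective and constants are illustrative.
import numpy as np

def f(x):
    return float(np.sum(x ** 2))   # black-box objective (sphere function)

rng = np.random.default_rng(0)
x = rng.normal(size=10)
fx, sigma = f(x), 1.0
for _ in range(2000):
    y = x + sigma * rng.normal(size=x.size)  # variation: Gaussian mutation
    fy = f(y)
    if fy <= fx:                             # selection: keep the better point
        x, fx = y, fy
        sigma *= np.exp(0.8)                 # success: increase the step size
    else:
        sigma *= np.exp(-0.2)                # failure: decrease the step size
# the two factors balance exactly at a success rate of 1/5
print(fx)
```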

Homepage of Tobias Glasmachers - Projects

Duales Training nichtlinearer Support-Vektor-Maschinen mit Budget (Dual Training of Nonlinear Support Vector Machines on a Budget)

This DFG-funded research project will start in 2016. I am looking for a motivated PhD candidate!

The Black-box Optimization Competition (BBComp)

The Black-box Optimization Competition (BBComp) is an online competition for black-box optimization in the continuous domain. It is the first competition of its kind in which the problems are true black boxes to the participants. This allows for a comparison of black-box optimization methods that is as fair and unbiased as possible. The large problem suite and the black-box interface avoid over-fitting to narrow sets of benchmark problems.
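
To make the notion of a black-box interface concrete, the sketch below queries a hypothetical objective only through point-wise function evaluations, here with pure random search. The evaluate function is a stand-in; this is not the actual BBComp API.

```python
# Hypothetical sketch of black-box optimization: the optimizer sees only
# function values, never the formula. This is NOT the actual BBComp interface.
import random

def evaluate(x):                  # hypothetical opaque objective
    return sum((xi - 0.3) ** 2 for xi in x)

dim, budget = 5, 1000
best_x = [random.random() for _ in range(dim)]
best_f = evaluate(best_x)
for _ in range(budget - 1):
    x = [random.random() for _ in range(dim)]  # random search in [0,1]^dim
    fx = evaluate(x)
    if fx < best_f:
        best_x, best_f = x, fx
print(best_f)
```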

Support-Vektor-Maschinen für extrem große Datenmengen (Support Vector Machines for Extremely Large Data Sets)

This research project ran from November 2013 to February 2016. It was conducted in cooperation with the Chair of Computational Statistics (Computergestützte Statistik) at TU Dortmund University and was funded by the Mercator Research Center Ruhr (MERCUR). The official project homepage is found here.

Homepage of Tobias Glasmachers - Open Positions

I have an open Ph.D. position in the area of kernel methods, starting in September 2016.

Homepage of Tobias Glasmachers - Software

Shark

I am an active developer of the Shark Machine Learning Library. Shark is an open-source, modular, and fast C++ library. A large share of my research code is either part of the library or based thereon. Check it out!

Asynchronous ES

An asynchronous natural evolution strategy.

Adaptive Coordinate Frequencies Coordinate Descent

Coordinate descent with online adaptation of coordinate frequencies for fast training of linear models.
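
For context, here is plain coordinate descent for ridge regression with uniform coordinate selection; this is a baseline sketch only. The ACF method instead adapts each coordinate's selection frequency online based on observed progress, which is omitted here.

```python
# Minimal sketch: coordinate descent for ridge regression with uniform
# coordinate selection. The ACF adaptation of selection frequencies is omitted.
import numpy as np

rng = np.random.default_rng(0)
n, d, lam = 200, 50, 1.0
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

w = np.zeros(d)
r = y - X @ w                      # residual, maintained incrementally
col_sq = (X ** 2).sum(axis=0)      # precomputed squared column norms
for _ in range(20 * d):
    j = rng.integers(d)            # uniform draw; ACF would bias this choice
    w_new = (X[:, j] @ r + col_sq[j] * w[j]) / (col_sq[j] + lam)
    r += X[:, j] * (w[j] - w_new)  # keep the residual consistent
    w[j] = w_new
```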

Hypervolume Maximization

Maximization of dominated hypervolume for multi-objective benchmark problems.
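
As a small illustration of the quantity being maximized, the following computes the dominated hypervolume of a two-dimensional front (under minimization) relative to a reference point; the points and reference are arbitrary.

```python
# Minimal sketch: dominated hypervolume of a 2-D front under minimization,
# measured against a reference point. Data and reference point are arbitrary.
def hypervolume_2d(points, ref):
    front = sorted(points)                   # sort by the first objective
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in front:
        if f2 < prev_f2:                     # skip dominated points
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv

print(hypervolume_2d([(1.0, 4.0), (2.0, 2.0), (3.0, 1.0)], ref=(5.0, 5.0)))  # 12.0
```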

xCMA-ES

CMA-ES with multiplicative covariance update.

Pareto Archive

An efficient archiving algorithm for non-dominated solutions in multi-objective optimization.
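
The idea can be illustrated with a naive archive that performs a linear dominance scan per insertion (minimization); the actual software uses a considerably more efficient scheme.

```python
# Naive sketch: archive of mutually non-dominated points (minimization).
# Linear scan per insertion; the actual software is far more efficient.
def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

class ParetoArchive:
    def __init__(self):
        self.points = []

    def insert(self, p):
        if any(dominates(q, p) or q == p for q in self.points):
            return False                  # p is dominated or a duplicate
        self.points = [q for q in self.points if not dominates(p, q)]
        self.points.append(p)             # p enters, points it dominates leave
        return True

arch = ParetoArchive()
for p in [(3, 1), (1, 4), (2, 2), (2, 3)]:
    arch.insert(p)
print(arch.points)                        # [(3, 1), (1, 4), (2, 2)]
```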