Prof. Dr. Tobias Glasmachers
Theory of Machine Learning
Optimization of Adaptive Systems
Institut für Neuroinformatik
Ruhr-Universität Bochum
Universitätsstraße 150
Building NB, Room NB 3/27
D-44801 Bochum, Germany
About Me
I am a professor for theory of machine learning at the Institut für Neuroinformatik, Ruhr-Universität Bochum, Germany. My research interests are (supervised) machine learning and optimization.
Short CV
 2004–2008: Ph.D. student in Christian Igel's group at the Institut für Neuroinformatik in Bochum. I received my Ph.D. in 2008 from the Faculty of Mathematics, Ruhr-Universität Bochum, Germany.
 2008–2009: Postdoc in the same group.
 2009–2011: Postdoc in Jürgen Schmidhuber's group at IDSIA, Lugano, Switzerland.
 since 2012: Junior professor for theory of machine learning at the Institut für Neuroinformatik, Ruhr-Universität Bochum, Germany. I am the head of the Optimization of Adaptive Systems group.
 2018: Promotion to full professor (in progress).
Research
My research is located in the area of machine learning, a modern branch of artificial intelligence research. This is an interdisciplinary research topic in between computer science, statistics, and optimization, with connections to the neurosciences and applications in robotics, engineering, medicine, economics, and many more disciplines. Within this wide area I am focusing on two aspects: supervised learning (including modern deep learning), and optimization with simple gradient-based methods and evolutionary algorithms.
Machine Learning
Supervised learning is a learning paradigm with endless (mostly technical) applications. A learning machine (algorithm) builds a predictive model from data provided in the form of input/output pairs. This allows for the automated solution of classification and regression problems. A prime example is the classification of objects in images, a classic computer vision task. I have recently started to branch out into reinforcement learning problems in 3D environments, aiming at fully autonomous behavior learning for robots and computer game agents (bots). My research activities cover both theoretical and practical aspects.
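The setting above can be sketched in a few lines. The following toy nearest-centroid classifier is a purely illustrative example (not code from any of my projects): it fits a model to labeled input/output pairs and then predicts labels for new inputs.

```python
# Toy supervised learning: fit a nearest-centroid model to (input, label)
# pairs, then predict labels for unseen points.
from collections import defaultdict
import math

def fit(samples):
    """Compute one centroid per class from (input, label) pairs."""
    sums = defaultdict(lambda: [0.0, 0.0])
    counts = defaultdict(int)
    for (x, y), label in samples:
        sums[label][0] += x
        sums[label][1] += y
        counts[label] += 1
    return {label: (s[0] / counts[label], s[1] / counts[label])
            for label, s in sums.items()}

def predict(centroids, point):
    """Assign the label of the nearest class centroid."""
    return min(centroids, key=lambda label: math.dist(point, centroids[label]))

train = [((0.0, 0.0), "A"), ((1.0, 0.0), "A"),
         ((5.0, 5.0), "B"), ((6.0, 5.0), "B")]
model = fit(train)
print(predict(model, (0.5, 0.2)))  # the point lies next to the "A" centroid
```

Real classifiers (SVMs, deep networks) replace the centroid rule with far richer model classes, but the fit/predict structure is the same.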
Optimization
Gradient-based optimization methods, particularly relatively simple first-order methods like (stochastic) gradient descent and coordinate descent, are at the heart of many modern training procedures for learning machines, in particular for (possibly regularized) empirical risk minimization. This includes backpropagation-based training of (deep) neural networks, as well as convex (primal or dual) optimization, e.g., for support vector machine training.
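As a minimal sketch of the idea (an illustration, not any specific training procedure mentioned above), plain stochastic gradient descent for L2-regularized empirical risk minimization with a one-parameter linear model and squared loss looks like this:

```python
# Minimal SGD for regularized empirical risk minimization:
# minimize (1/n) * sum_i (w*x_i - y_i)^2 + lam * w^2 over a scalar weight w.
import random

def sgd(data, lam=0.01, lr=0.1, epochs=200, seed=0):
    rng = random.Random(seed)
    w = 0.0
    for _ in range(epochs):
        x, y = rng.choice(data)                       # draw one training pair
        grad = 2.0 * (w * x - y) * x + 2.0 * lam * w  # stochastic gradient
        w -= lr * grad                                # gradient descent step
    return w

# noise-free data from y = 2x; the learned weight approaches the slope 2
data = [(x, 2.0 * x) for x in (0.5, 1.0, 1.5, 2.0)]
print(sgd(data))
```

Each step uses the gradient of a single summand of the empirical risk, which is what makes the method cheap enough for large data sets.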
Evolutionary Algorithms (EAs) are a class of nature-inspired algorithms that mimic the process of Darwinian evolution. This process is decomposed into the components inheritance, variation, and selection. It has been widely recognized that EAs are useful for search and optimization, in particular when derivatives are not available. Formally they can be understood as randomized direct search heuristics, and they are well suited for tackling black-box optimization problems. I focus on evolution strategies, a class of optimization algorithms for continuous variables, and on multiobjective optimization.
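A minimal example of an evolution strategy (an illustrative sketch under common textbook assumptions, not a reference implementation): the (1+1)-ES with the classic 1/5 success rule, applied to the sphere function as a black-box objective.

```python
# (1+1) evolution strategy with 1/5 success rule on a black-box objective.
import random

def sphere(x):
    """Black-box objective: sum of squares, minimum 0 at the origin."""
    return sum(v * v for v in x)

def one_plus_one_es(f, x, sigma=1.0, iters=500, seed=0):
    rng = random.Random(seed)
    fx = f(x)
    for _ in range(iters):
        # variation: Gaussian mutation of the single parent
        y = [v + sigma * rng.gauss(0.0, 1.0) for v in x]
        fy = f(y)
        if fy <= fx:                 # selection: keep the offspring if not worse
            x, fx = y, fy
            sigma *= 1.5             # success: increase the step size
        else:
            sigma *= 1.5 ** -0.25    # failure: decrease it (1/5 success rule)
    return x, fx

best, fbest = one_plus_one_es(sphere, [5.0, -3.0, 2.0])
print(fbest)  # close to 0 after 500 black-box evaluations
```

The asymmetric update factors balance out exactly when one in five mutations succeeds, which keeps the step size near the regime of fastest progress.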
Shark
I am an active developer of the Shark Machine Learning Library. Shark is an open-source, modular, and fast C++ library. Check it out!
Asynchronous ES
An asynchronous natural evolution strategy.
Adaptive Coordinate Frequencies Coordinate Descent
Coordinate descent with online adaptation of coordinate frequencies for fast training of linear models.
LASSO code, modified liblinear.
Hypervolume Maximization
Maximization of dominated hypervolume for multiobjective benchmark problems.
xCMA-ES
CMA-ES with multiplicative covariance update.
Pareto Archive
An efficient archiving algorithm for non-dominated solutions in multiobjective optimization.
Stochastic Gradient Optimization
Comparison of SGD, SAG, SVRG, and ADAM for training kernel machines on a budget.
Limits of End-to-End Learning
Dual Training of Nonlinear Support Vector Machines with a Budget (Duales Training nichtlinearer Support-Vektor-Maschinen mit Budget)
This DFG-funded research project started in October 2016.
The Black-Box Optimization Competition (BBComp)
The Black-Box Optimization Competition (BBComp) is an online competition for black-box optimization in the continuous domain. It is the first competition of its kind in which the problems are true black boxes to the participants. This allows for a comparison of black-box optimization methods that is as fair and unbiased as possible. The large problem suite and the black-box interface avoid overfitting to narrow suites of benchmark problems.
Support Vector Machines for Extremely Large Data Sets (Support-Vektor-Maschinen für extrem große Datenmengen)
This research project started in November 2013 and ended in February 2016. It was conducted in cooperation with the chair Computergestützte Statistik (Computational Statistics) at the Technical University of Dortmund. It was funded by the Mercator Research Center Ruhr (MERCUR). The official project homepage is found here.

Publications
Akimoto, Y., Auger, A., & Glasmachers, T. (2018). Drift Theory in Continuous Search Spaces: Expected Hitting Time of the (1+1)-ES with 1/5 Success Rule. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO). ACM.
Glasmachers, T. (2017). A Fast Incremental BSP Tree Archive for Non-dominated Points. In Evolutionary Multi-Criterion Optimization (EMO). Springer.
Glasmachers, T. (2017). Limits of End-to-End Learning. In Proceedings of the 9th Asian Conference on Machine Learning (ACML).
Glasmachers, T. (2017). Global Convergence of the (1+1) Evolution Strategy (No. arXiv:1706.02887). arxiv.org.
Irmer, T., Glasmachers, T., & Maji, S. (2017). Texture attribute synthesis and transfer using feed-forward CNNs. In Winter Conference on Applications of Computer Vision (WACV). IEEE.
Krause, O., Glasmachers, T., & Igel, C. (2017). Qualitative and Quantitative Assessment of Step Size Adaptation Rules. In Conference on Foundations of Genetic Algorithms (FOGA). ACM.
Demircioğlu, A., Horn, D., Glasmachers, T., Bischl, B., & Weihs, C. (2016). Fast model selection by limiting SVM training times (No. arXiv:1602.03368v1). arxiv.org.
Doğan, Ü., Glasmachers, T., & Igel, C. (2016). A Unified View on Multiclass Support Vector Classification. Journal of Machine Learning Research, 17(45), 1–32.
Glasmachers, T. (2016). Finite Sum Acceleration vs. Adaptive Learning Rates for the Training of Kernel Machines on a Budget. In NIPS workshop on Optimization for Machine Learning.
Glasmachers, T. (2016). Small Stochastic Average Gradient Steps. In NIPS workshop on Optimizing the Optimizers.
Horn, D., Demircioğlu, A., Bischl, B., Glasmachers, T., & Weihs, C. (2016). A Comparative Study on Large Scale Kernelized Support Vector Machines. Advances in Data Analysis and Classification (ADAC), 1–17.
Krause, O., Glasmachers, T., Hansen, N., & Igel, C. (2016). Unbounded Population MO-CMA-ES for the Bi-Objective BBOB Test Suite. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO).
Krause, O., Glasmachers, T., & Igel, C. (2016). Multiobjective Optimization with Unbounded Solution Sets. In NIPS workshop on Bayesian Optimization.
Loshchilov, I., & Glasmachers, T. (2016). Anytime Bi-Objective Optimization with a Hybrid Multi-Objective CMA-ES (HMO-CMA-ES). In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO).
Weihs, C., & Glasmachers, T. (2016). Supervised Classification. In C. Weihs, Jannach, D., Vatolkin, I., & Rudolph, G. (Eds.), Music Data Analysis: Foundations and Applications.
Krause, O., & Glasmachers, T. (2015). A CMA-ES with Multiplicative Covariance Matrix Updates. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO).
Danafar, S., Rancoita, P. M. V., Glasmachers, T., Whittingstall, K., & Schmidhuber, J. (2014). Testing Hypotheses by Regularized Maximum Mean Discrepancy. International Journal of Computer and Information Technology (IJCIT), 3(2).
Glasmachers, T. (2014). Handling Sharp Ridges with Local Supremum Transformations. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO).
Glasmachers, T. (2014). Optimized Approximation Sets for Low-dimensional Benchmark Pareto Fronts. In Parallel Problem Solving from Nature (PPSN). Springer.
Glasmachers, T., Naujoks, B., & Rudolph, G. (2014). Start Small, Grow Big – Saving Multiobjective Function Evaluations. In Parallel Problem Solving from Nature (PPSN). Springer.
Wierstra, D., Schaul, T., Glasmachers, T., Sun, Y., Peters, J., & Schmidhuber, J. (2014). Natural Evolution Strategies. Journal of Machine Learning Research, 15, 949–980.
Glasmachers, T. (2013). A Natural Evolution Strategy with Asynchronous Strategy Updates. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO).
Glasmachers, T. (2013). The Planning-ahead SMO Algorithm (No. arXiv:1305.0423v1). arxiv.org.
Glasmachers, T., & Doğan, Ü. (2013). Accelerated Coordinate Descent with Adaptive Coordinate Frequencies. In Proceedings of the fifth Asian Conference on Machine Learning (ACML).
Krause, O., Fischer, A., Glasmachers, T., & Igel, C. (2013). Approximation properties of DBNs with binary hidden units and real-valued visible units. In Proceedings of the International Conference on Machine Learning (ICML).
Doğan, Ü., Glasmachers, T., & Igel, C. (2012). Turning Binary Large-margin Bounds into Multiclass Bounds. In ICML workshop on RKHS and kernel-based methods.
Doğan, Ü., Glasmachers, T., & Igel, C. (2012). A Note on Extending Generalization Bounds for Binary Large-margin Classifiers to Multiple Classes. In Proceedings of the European Conference on Machine Learning and Knowledge Discovery in Databases (ECML-PKDD).
Glasmachers, T. (2012). Convergence of the IGO-Flow of Isotropic Gaussian Distributions on Convex Quadratic Problems. In C. C. Coello, Cutello, V., Deb, K., Forrest, S., Nicosia, G., & Pavone, M. (Eds.), Parallel Problem Solving from Nature (PPSN). Springer.
Glasmachers, T., Koutník, J., & Schmidhuber, J. (2012). Kernel Representations for Evolving Continuous Functions. Journal of Evolutionary Intelligence, 5(3), 171–187. http://doi.org/10.1007/s12065-012-0070-y
Cuccu, G., Gomez, F., & Glasmachers, T. (2011). Novelty Restarts for Evolution Strategies. In Proceedings of the IEEE Congress on Evolutionary Computation (CEC). IEEE.
Glasmachers, T., & Schmidhuber, J. (2011). Optimal Direct Policy Search. In Proceedings of the 4th Conference on Artificial General Intelligence (AGI).
Graziano, V., Glasmachers, T., Schaul, T., Pape, L., Cuccu, G., Leitner, J., & Schmidhuber, J. (2011). Artificial Curiosity for Autonomous Space Exploration. ACTA FUTURA.
Schaul, T., Glasmachers, T., & Schmidhuber, J. (2011). High Dimensions and Heavy Tails for Natural Evolution Strategies. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO).
Schaul, T., Pape, L., Glasmachers, T., Graziano, V., & Schmidhuber, J. (2011). Coherence Progress: A Measure of Interestingness Based on Fixed Compressors. In Proceedings of the 4th Conference on Artificial General Intelligence (AGI).
Glasmachers, T. (2010). Universal Consistency of Multi-Class Support Vector Classification. In Advances in Neural Information Processing Systems (NIPS).
Glasmachers, T., & Igel, C. (2010). Maximum Likelihood Model Selection for 1-Norm Soft Margin SVMs with Multiple Parameters. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(8), 1522–1528.
Glasmachers, T., Schaul, T., & Schmidhuber, J. (2010). A Natural Evolution Strategy for Multi-Objective Optimization. In Parallel Problem Solving from Nature (PPSN). Springer.
Glasmachers, T., Schaul, T., Sun, Y., Wierstra, D., & Schmidhuber, J. (2010). Exponential Natural Evolution Strategies. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO).
Sun, Y., Glasmachers, T., Schaul, T., & Schmidhuber, J. (2010). Frontier Search. In Proceedings of the 3rd Conference on Artificial General Intelligence (AGI).
Glasmachers, T. (2008). On related violating pairs for working set selection in SMO algorithms. In M. Verleysen (Ed.), Proceedings of the 16th European Symposium on Artificial Neural Networks (ESANN). d-side publications.
Glasmachers, T. (2008). Gradient Based Optimization of Support Vector Machines. Doctoral thesis, Fakultät für Mathematik, Ruhr-Universität Bochum, Germany.
SecondOrder SMO Improves SVM Online and Active LearningGlasmachers, T., & Igel, C.Neural Computation, 20(2), 374–382
@article{GlasmachersIgel2008, author = {Glasmachers, T. and Igel, C.}, title = {SecondOrder SMO Improves SVM Online and Active Learning}, journal = {Neural Computation}, volume = {20}, number = {2}, pages = {374–382}, year = {2008}, }
Glasmachers, T., & Igel, C. (2008). SecondOrder SMO Improves SVM Online and Active Learning. Neural Computation, 20(2), 374–382. 
Uncertainty Handling in Model Selection for Support Vector MachinesGlasmachers, T., & Igel, C.In G. Rudolph, Jansen, T., Lucas, S., Poloni, C., & Beume, N. (Eds.), Parallel Problem Solving from Nature (PPSN) (pp. 185–194) Springer
@inproceedings{GlasmachersIgel2008b, author = {Glasmachers, T. and Igel, C.}, title = {Uncertainty Handling in Model Selection for Support Vector Machines}, booktitle = {Parallel Problem Solving from Nature (PPSN)}, editor = {Rudolph, G. and Jansen, T. and Lucas, S. and Poloni, C. and Beume, N.}, pages = {185–194}, publisher = {Springer}, year = {2008}, }
Glasmachers, T., & Igel, C. (2008). Uncertainty Handling in Model Selection for Support Vector Machines. In G. Rudolph, Jansen, T., Lucas, S., Poloni, C., & Beume, N. (Eds.), Parallel Problem Solving from Nature (PPSN) (pp. 185–194). Springer. 
SharkIgel, C., HeidrichMeisner, V., & Glasmachers, T.Journal of Machine Learning Research, 9, 993–996
@article{IgelHeidrichMeisnerGlasmachers2008, author = {Igel, C. and HeidrichMeisner, V. and Glasmachers, T.}, title = {Shark}, journal = {Journal of Machine Learning Research}, volume = {9}, pages = {993–996}, year = {2008}, }
Igel, C., HeidrichMeisner, V., & Glasmachers, T.. (2008). Shark. Journal of Machine Learning Research, 9, 993–996. 
@article{IgelGlasmachersMerschEtAl2007, author = {Igel, C. and Glasmachers, T. and Mersch, B. and Pfeifer, N. and Meinicke, P.}, title = {Gradient-Based Optimization of Kernel-Target Alignment for Sequence Kernels Applied to Bacterial Gene Start Detection}, journal = {IEEE/ACM Transactions on Computational Biology and Bioinformatics (TCBB)}, volume = {4}, number = {2}, pages = {216–226}, year = {2007}, }
Igel, C., Glasmachers, T., Mersch, B., Pfeifer, N., & Meinicke, P. (2007). Gradient-Based Optimization of Kernel-Target Alignment for Sequence Kernels Applied to Bacterial Gene Start Detection. IEEE/ACM Transactions on Computational Biology and Bioinformatics (TCBB), 4(2), 216–226.
@article{MerschGlasmachersMeinickeEtAl2007, author = {Mersch, B. and Glasmachers, T. and Meinicke, P. and Igel, C.}, title = {Evolutionary Optimization of Sequence Kernels for Detection of Bacterial Gene Starts}, journal = {International Journal of Neural Systems}, volume = {17}, number = {5}, pages = {369–381}, year = {2007}, }
Mersch, B., Glasmachers, T., Meinicke, P., & Igel, C. (2007). Evolutionary Optimization of Sequence Kernels for Detection of Bacterial Gene Starts. International Journal of Neural Systems, 17(5), 369–381. 
@inproceedings{Glasmachers2006, author = {Glasmachers, T.}, title = {Degeneracy in Model Selection for SVMs with Radial Gaussian Kernel}, booktitle = {Proceedings of the 14th European Symposium on Artificial Neural Networks (ESANN)}, editor = {Verleysen, M.}, publisher = {d-side publications}, year = {2006}, }
Glasmachers, T. (2006). Degeneracy in Model Selection for SVMs with Radial Gaussian Kernel. In M. Verleysen (Ed.), Proceedings of the 14th European Symposium on Artificial Neural Networks (ESANN). d-side publications.
@article{GlasmachersIgel2006, author = {Glasmachers, T. and Igel, C.}, title = {Maximum-Gain Working Set Selection for Support Vector Machines}, journal = {Journal of Machine Learning Research}, volume = {7}, pages = {1437–1466}, year = {2006}, }
Glasmachers, T., & Igel, C. (2006). Maximum-Gain Working Set Selection for Support Vector Machines. Journal of Machine Learning Research, 7, 1437–1466.
@inproceedings{MerschGlasmachersMeinickeEtAl2006, author = {Mersch, B. and Glasmachers, T. and Meinicke, P. and Igel, C.}, title = {Evolutionary Optimization of Sequence Kernels for Detection of Bacterial Gene Starts}, booktitle = {Proceedings of the 16th International Conference on Artificial Neural Networks (ICANN)}, publisher = {Springer-Verlag}, year = {2006}, }
Mersch, B., Glasmachers, T., Meinicke, P., & Igel, C. (2006). Evolutionary Optimization of Sequence Kernels for Detection of Bacterial Gene Starts. In Proceedings of the 16th International Conference on Artificial Neural Networks (ICANN). Springer-Verlag.
@article{GlasmachersIgel2005, author = {Glasmachers, T. and Igel, C.}, title = {Gradient-based Adaptation of General Gaussian Kernels}, journal = {Neural Computation}, volume = {17}, number = {10}, pages = {2099–2105}, year = {2005}, }
Glasmachers, T., & Igel, C. (2005). Gradient-based Adaptation of General Gaussian Kernels. Neural Computation, 17(10), 2099–2105.
Summer Term 2018
Lectures: Machine Learning: Supervised Methods
Seminars: Master Seminar Supervised Methods
Winter Term 2017/2018
Lectures: Machine Learning: Evolutionary Algorithms
Summer Term 2017
Lectures: Machine Learning: Supervised Methods
Winter Term 2016/2017
Lectures: Machine Learning: Evolutionary Algorithms
I offer Master's theses in the areas of machine learning and optimization. Prerequisites:
 completion of at least one of my lectures
 programming skills and/or a solid mathematical background
Please contact me for details and for currently open topics.

@mastersthesis{Bönnighausen2018, author = {Bönnighausen, Nils}, title = {Entwicklung eines erweiterbaren grafischen Frameworks für Klassifikationsalgorithmen}, school = {Applied Informatics, Univ. of Bochum, Germany}, type = {Bachelor's thesis}, month = {February}, year = {2018}, }
Bönnighausen, N. (2018, February). Entwicklung eines erweiterbaren grafischen Frameworks für Klassifikationsalgorithmen. Bachelor's thesis, Applied Informatics, Univ. of Bochum, Germany.
@mastersthesis{Kalus2017, author = {Kalus, Christopher}, title = {Tuning tiefer neuronaler Netze mit automatischen Parameter-Optimierern}, school = {Applied Informatics, Univ. of Bochum, Germany}, type = {Bachelor's thesis}, month = {December}, year = {2017}, }
Kalus, C. (2017, December). Tuning tiefer neuronaler Netze mit automatischen Parameter-Optimierern. Bachelor's thesis, Applied Informatics, Univ. of Bochum, Germany.
@mastersthesis{Bulinski2017, author = {Bulinski, Patrick}, title = {Material Recognition with Convolutional Neural Networks}, school = {Applied Informatics, Univ. of Bochum, Germany}, type = {Bachelor's thesis}, month = {November}, year = {2017}, }
Bulinski, P. (2017, November). Material Recognition with Convolutional Neural Networks. Bachelor's thesis, Applied Informatics, Univ. of Bochum, Germany.
@mastersthesis{Kalinski2016, author = {Kalinski, Dawid}, title = {Implementation and Evaluation of Ensemble Support Vector Machines}, school = {Applied Informatics, Univ. of Bochum, Germany}, month = {October}, year = {2016}, }
Kalinski, D. (2016, October). Implementation and Evaluation of Ensemble Support Vector Machines. Master’s thesis, Applied Informatics, Univ. of Bochum, Germany. 
@mastersthesis{Marek2016, author = {Marek, Phil}, title = {Training linearer Support-Vektor-Maschinen mit einem Trust-Region-Newton-Algorithmus}, school = {Applied Informatics, Univ. of Bochum, Germany}, type = {Bachelor's thesis}, month = {March}, year = {2016}, }
Marek, P. (2016, March). Training linearer Support-Vektor-Maschinen mit einem Trust-Region-Newton-Algorithmus. Bachelor's thesis, Applied Informatics, Univ. of Bochum, Germany.
@mastersthesis{Zhang2015, author = {Zhang, Minghui}, title = {Ensembles von Support-Vektor-Maschinen mit Budget}, school = {Applied Informatics, Univ. of Bochum, Germany}, month = {July}, year = {2015}, }
Zhang, M. (2015, July). Ensembles von Support-Vektor-Maschinen mit Budget. Master's thesis, Applied Informatics, Univ. of Bochum, Germany.
@mastersthesis{Vonk2014, author = {Vonk, Daniel}, title = {Evaluation einer Evolutionsstrategie zur Optimierung unter Nebenbedingungen}, school = {Applied Informatics, Univ. of Bochum, Germany}, type = {Bachelor's thesis}, month = {May}, year = {2014}, }
Vonk, D. (2014, May). Evaluation einer Evolutionsstrategie zur Optimierung unter Nebenbedingungen. Bachelor's thesis, Applied Informatics, Univ. of Bochum, Germany.
@mastersthesis{Sosna2014, author = {Sosna, Sebastian}, title = {Online-Anpassung von Lernraten im stochastischen Gradientenabstieg zum Training neuronaler Netze}, school = {Applied Informatics, Univ. of Bochum, Germany}, type = {Bachelor's thesis}, month = {April}, year = {2014}, }
Sosna, S. (2014, April). Online-Anpassung von Lernraten im stochastischen Gradientenabstieg zum Training neuronaler Netze. Bachelor's thesis, Applied Informatics, Univ. of Bochum, Germany.
@mastersthesis{Scheller2013, author = {Scheller, Thorsten}, title = {Empirische Analyse direkter randomisierter Suchverfahren mit variabler Metrik und Konvergenzgarantie}, school = {Applied Informatics, Univ. of Bochum, Germany}, type = {Bachelor's thesis}, month = {September}, year = {2013}, }
Scheller, T. (2013, September). Empirische Analyse direkter randomisierter Suchverfahren mit variabler Metrik und Konvergenzgarantie. Bachelor's thesis, Applied Informatics, Univ. of Bochum, Germany.
@mastersthesis{Stamm2013, author = {Stamm, Gennadi}, title = {Constraint Handling in Evolution Strategies Applied to Trajectory Planning}, school = {ETIT Dept., Univ. of Bochum, Germany}, type = {Bachelor's thesis}, month = {January}, year = {2013}, }
Stamm, G. (2013, January). Constraint Handling in Evolution Strategies Applied to Trajectory Planning. Bachelor's thesis, ETIT Dept., Univ. of Bochum, Germany.
The Institut für Neuroinformatik (INI) is a central research unit of the Ruhr-Universität Bochum. We aim to understand the fundamental principles through which organisms generate behavior and cognition while linked to their environments through sensory systems and while acting in those environments through effector systems. Inspired by our insights into such natural cognitive systems, we seek new solutions to problems of information processing in artificial cognitive systems. We draw from a variety of disciplines that include experimental approaches from psychology and neurophysiology as well as theoretical approaches from physics, mathematics, electrical engineering and applied computer science.
Universitätsstr. 150, Building NB, Room 3/32
D-44801 Bochum, Germany
Tel: (+49) 234 32-28967
Fax: (+49) 234 32-14210