Optimization of Adaptive Systems


The Optimization of Adaptive Systems workgroup is concerned with the design and analysis of adaptive information processing systems. We are interested in systems that improve over time through learning, self-adaptation, and evolution; such systems improve autonomously from data rather than through manual instruction or programming.

Figure: Parameter tuning for a support vector machine

Currently we are working on:

  • fast training and model selection for support vector machines
  • learning in deep networks
  • development of new evolutionary search algorithms and their analysis
  • open source implementation of a large number of machine learning algorithms

We offer theoretical and practical advice in machine learning and computational intelligence to other research groups and industrial partners.
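As a toy illustration of the evolutionary search algorithms mentioned above, the following sketch implements a minimal (1+1) Evolution Strategy with the classic 1/5 success rule, the algorithm class analyzed in the group's 2018 GECCO paper. The function names and parameter choices here are illustrative assumptions, not code from the group or from Shark:

```python
import math
import random

def one_plus_one_es(f, x0, sigma=1.0, iterations=200, seed=1):
    """Minimal (1+1)-ES minimizing f, with a per-iteration 1/5 success rule.

    On success the step size grows by a factor alpha; on failure it shrinks
    by alpha**(-1/4). At a success rate of exactly 1/5 the step size is
    stationary on average.
    """
    rng = random.Random(seed)
    alpha = math.exp(0.2)
    x, fx = list(x0), f(x0)
    for _ in range(iterations):
        # sample a single offspring from an isotropic Gaussian around the parent
        y = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fy = f(y)
        if fy <= fx:
            x, fx = y, fy            # accept the non-worsening offspring ...
            sigma *= alpha           # ... and enlarge the step size
        else:
            sigma *= alpha ** -0.25  # reject the offspring and shrink the step size
    return x, fx

# usage: minimize the 5-dimensional sphere function
sphere = lambda v: sum(c * c for c in v)
best, value = one_plus_one_es(sphere, [1.0] * 5)
```

Because only non-worsening offspring are accepted, the best function value is monotonically non-increasing; the 1/5 rule adapts the step size so that roughly one in five mutations succeeds.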

Software: Shark

We develop and maintain Shark, a fast, modular, open source C++ library for the design and optimization of adaptive systems. It offers a diverse set of machine learning algorithms as well as methods for non-linear optimization. Shark runs under SunOS/Linux, Mac OS X, and Windows and can be downloaded from shark-ml.org.

Publications

    2018

  • Drift Theory in Continuous Search Spaces: Expected Hitting Time of the (1+1)-ES with 1/5 Success Rule
    Akimoto, Y., Auger, A., & Glasmachers, T.
    In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO), ACM
    2017

  • A Fast Incremental BSP Tree Archive for Non-dominated Points
    Glasmachers, T.
    In Evolutionary Multi-Criterion Optimization (EMO), Springer
  • Limits of End-to-End Learning
    Glasmachers, T.
    In Proceedings of the 9th Asian Conference on Machine Learning (ACML)
  • Global Convergence of the (1+1) Evolution Strategy
    Glasmachers, T.
    arXiv.org
  • Texture attribute synthesis and transfer using feed-forward CNNs
    Irmer, T., Glasmachers, T., & Maji, S.
    In Winter Conference on Applications of Computer Vision (WACV), IEEE
  • Qualitative and Quantitative Assessment of Step Size Adaptation Rules
    Krause, O., Glasmachers, T., & Igel, C.
    In Conference on Foundations of Genetic Algorithms (FOGA), ACM
    2016

  • Fast model selection by limiting SVM training times
    Demircioğlu, A., Horn, D., Glasmachers, T., Bischl, B., & Weihs, C.
    arXiv.org
  • A Unified View on Multi-class Support Vector Classification
    Doğan, Ü., Glasmachers, T., & Igel, C.
    Journal of Machine Learning Research, 17(45), 1–32
  • Finite Sum Acceleration vs. Adaptive Learning Rates for the Training of Kernel Machines on a Budget
    Glasmachers, T.
    In NIPS workshop on Optimization for Machine Learning
  • Small Stochastic Average Gradient Steps
    Glasmachers, T.
    In NIPS workshop on Optimizing the Optimizers
  • A Comparative Study on Large Scale Kernelized Support Vector Machines
    Horn, D., Demircioğlu, A., Bischl, B., Glasmachers, T., & Weihs, C.
    Advances in Data Analysis and Classification (ADAC), 1–17
  • Unbounded Population MO-CMA-ES for the Bi-Objective BBOB Test Suite
    Krause, O., Glasmachers, T., Hansen, N., & Igel, C.
    In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO)
  • Multi-objective Optimization with Unbounded Solution Sets
    Krause, O., Glasmachers, T., & Igel, C.
    In NIPS workshop on Bayesian Optimization
  • Anytime Bi-Objective Optimization with a Hybrid Multi-Objective CMA-ES (HMO-CMA-ES)
    Loshchilov, I., & Glasmachers, T.
    In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO)
  • Supervised Classification
    Weihs, C., & Glasmachers, T.
    In C. Weihs, Jannach, D., Vatolkin, I., & Rudolph, G. (Eds.), Music Data Analysis: Foundations and Applications
    2015

  • A CMA-ES with Multiplicative Covariance Matrix Updates
    Krause, O., & Glasmachers, T.
    In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO)
    2014

  • Testing Hypotheses by Regularized Maximum Mean Discrepancy
    Danafar, S., Rancoita, P. M. V., Glasmachers, T., Whittingstall, K., & Schmidhuber, J.
    International Journal of Computer and Information Technology (IJCIT), 3(2)
  • Handling Sharp Ridges with Local Supremum Transformations
    Glasmachers, T.
    In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO)
  • Optimized Approximation Sets for Low-dimensional Benchmark Pareto Fronts
    Glasmachers, T.
    In Parallel Problem Solving from Nature (PPSN), Springer
  • Start Small, Grow Big - Saving Multiobjective Function Evaluations
    Glasmachers, T., Naujoks, B., & Rudolph, G.
    In Parallel Problem Solving from Nature (PPSN), Springer
  • Natural Evolution Strategies
    Wierstra, D., Schaul, T., Glasmachers, T., Sun, Y., Peters, J., & Schmidhuber, J.
    Journal of Machine Learning Research, 15, 949–980
    2013

  • A Natural Evolution Strategy with Asynchronous Strategy Updates
    Glasmachers, T.
    In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO)
  • The Planning-ahead SMO Algorithm
    Glasmachers, T.
    arXiv.org
  • Accelerated Coordinate Descent with Adaptive Coordinate Frequencies
    Glasmachers, T., & Doğan, Ü.
    In Proceedings of the 5th Asian Conference on Machine Learning (ACML)
  • Approximation properties of DBNs with binary hidden units and real-valued visible units
    Krause, O., Fischer, A., Glasmachers, T., & Igel, C.
    In Proceedings of the International Conference on Machine Learning (ICML)
    2012

  • Turning Binary Large-margin Bounds into Multi-class Bounds
    Doğan, Ü., Glasmachers, T., & Igel, C.
    In ICML workshop on RKHS and kernel-based methods
  • A Note on Extending Generalization Bounds for Binary Large-margin Classifiers to Multiple Classes
    Doğan, Ü., Glasmachers, T., & Igel, C.
    In Proceedings of the European Conference on Machine Learning and Knowledge Discovery in Databases (ECML-PKDD)
  • Convergence of the IGO-Flow of Isotropic Gaussian Distributions on Convex Quadratic Problems
    Glasmachers, T.
    In C. C. Coello, Cutello, V., Deb, K., Forrest, S., Nicosia, G., & Pavone, M. (Eds.), Parallel Problem Solving from Nature (PPSN), Springer
  • Kernel Representations for Evolving Continuous Functions
    Glasmachers, T., Koutník, J., & Schmidhuber, J.
    Journal of Evolutionary Intelligence, 5(3), 171–187
    2011

  • Novelty Restarts for Evolution Strategies
    Cuccu, G., Gomez, F., & Glasmachers, T.
    In Proceedings of the IEEE Congress on Evolutionary Computation (CEC), IEEE
  • Optimal Direct Policy Search
    Glasmachers, T., & Schmidhuber, J.
    In Proceedings of the 4th Conference on Artificial General Intelligence (AGI)
  • Artificial Curiosity for Autonomous Space Exploration
    Graziano, V., Glasmachers, T., Schaul, T., Pape, L., Cuccu, G., Leitner, J., & Schmidhuber, J.
    Acta Futura
  • High Dimensions and Heavy Tails for Natural Evolution Strategies
    Schaul, T., Glasmachers, T., & Schmidhuber, J.
    In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO)
  • Coherence Progress: A Measure of Interestingness Based on Fixed Compressors
    Schaul, T., Pape, L., Glasmachers, T., Graziano, V., & Schmidhuber, J.
    In Proceedings of the 4th Conference on Artificial General Intelligence (AGI)
    2010

  • Universal Consistency of Multi-Class Support Vector Classification
    Glasmachers, T.
    In Advances in Neural Information Processing Systems (NIPS)
  • Maximum Likelihood Model Selection for 1-Norm Soft Margin SVMs with Multiple Parameters
    Glasmachers, T., & Igel, C.
    IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(8), 1522–1528
  • A Natural Evolution Strategy for Multi-Objective Optimization
    Glasmachers, T., Schaul, T., & Schmidhuber, J.
    In Parallel Problem Solving from Nature (PPSN), Springer
  • Exponential Natural Evolution Strategies
    Glasmachers, T., Schaul, T., Sun, Y., Wierstra, D., & Schmidhuber, J.
    In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO)
  • Frontier Search
    Sun, Y., Glasmachers, T., Schaul, T., & Schmidhuber, J.
    In Proceedings of the 3rd Conference on Artificial General Intelligence (AGI)
    2008

  • On related violating pairs for working set selection in SMO algorithms
    Glasmachers, T.
    In M. Verleysen (Ed.), Proceedings of the 16th European Symposium on Artificial Neural Networks (ESANN), d-side publications
  • Gradient Based Optimization of Support Vector Machines
    Glasmachers, T.
    Doctoral thesis, Fakultät für Mathematik, Ruhr-Universität Bochum, Germany
  • Second-Order SMO Improves SVM Online and Active Learning
    Glasmachers, T., & Igel, C.
    Neural Computation, 20(2), 374–382
  • Uncertainty Handling in Model Selection for Support Vector Machines
    Glasmachers, T., & Igel, C.
    In G. Rudolph, Jansen, T., Lucas, S., Poloni, C., & Beume, N. (Eds.), Parallel Problem Solving from Nature (PPSN) (pp. 185–194), Springer
  • Shark
    Igel, C., Heidrich-Meisner, V., & Glasmachers, T.
    Journal of Machine Learning Research, 9, 993–996
    2007

  • Gradient-Based Optimization of Kernel-Target Alignment for Sequence Kernels Applied to Bacterial Gene Start Detection
    Igel, C., Glasmachers, T., Mersch, B., Pfeifer, N., & Meinicke, P.
    IEEE/ACM Transactions on Computational Biology and Bioinformatics (TCBB), 4(2), 216–226
  • Evolutionary Optimization of Sequence Kernels for Detection of Bacterial Gene Starts
    Mersch, B., Glasmachers, T., Meinicke, P., & Igel, C.
    International Journal of Neural Systems, 17(5), 369–381
    2006

  • Degeneracy in Model Selection for SVMs with Radial Gaussian Kernel
    Glasmachers, T.
    In M. Verleysen (Ed.), Proceedings of the 14th European Symposium on Artificial Neural Networks (ESANN), d-side publications
  • Maximum-Gain Working Set Selection for Support Vector Machines
    Glasmachers, T., & Igel, C.
    Journal of Machine Learning Research, 7, 1437–1466
  • Evolutionary Optimization of Sequence Kernels for Detection of Bacterial Gene Starts
    Mersch, B., Glasmachers, T., Meinicke, P., & Igel, C.
    In Proceedings of the 16th International Conference on Artificial Neural Networks (ICANN), Springer-Verlag
    2005

  • Gradient-based Adaptation of General Gaussian Kernels
    Glasmachers, T., & Igel, C.
    Neural Computation, 17(10), 2099–2105

The Institut für Neuroinformatik (INI) is a central research unit of the Ruhr-Universität Bochum. We aim to understand the fundamental principles through which organisms generate behavior and cognition while linked to their environments through sensory systems and while acting in those environments through effector systems. Inspired by our insights into such natural cognitive systems, we seek new solutions to problems of information processing in artificial cognitive systems. We draw from a variety of disciplines that include experimental approaches from psychology and neurophysiology as well as theoretical approaches from physics, mathematics, electrical engineering and applied computer science.

Universitätsstr. 150, Building NB, Room 3/32
D-44801 Bochum, Germany

Tel: (+49) 234 32-28967
Fax: (+49) 234 32-14210