Optimization of Adaptive Systems


The Optimization of Adaptive Systems workgroup is concerned with the design and analysis of adaptive information processing systems. We are interested in systems that improve over time through learning, self-adaptation, and evolution. Such systems improve autonomously from data, in contrast to systems that are instructed or programmed by hand.

[Figure: Parameter tuning for a support vector machine]

Currently we are working on:

  • fast training and model selection for support vector machines
  • learning in deep networks
  • the development and analysis of new evolutionary search algorithms (a minimal sketch follows this list)
  • an open-source implementation of a large number of machine learning algorithms
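
To give a concrete impression of the evolutionary search topic above, here is a minimal sketch of a (1+1) evolution strategy with the classical 1/5 success rule for step-size adaptation, minimizing the sphere function. This is a generic textbook algorithm rather than one of the group's own methods; the dimension, step sizes, and iteration budget are illustrative.

    #include <cmath>
    #include <cstdio>
    #include <random>
    #include <vector>

    // Sphere function: f(x) = sum_i x_i^2, minimized at the origin.
    double sphere(const std::vector<double>& x) {
        double s = 0.0;
        for (double xi : x) s += xi * xi;
        return s;
    }

    int main() {
        const int dim = 10;
        std::mt19937 rng(42);
        std::normal_distribution<double> gauss(0.0, 1.0);

        std::vector<double> parent(dim, 1.0);  // start away from the optimum
        double fParent = sphere(parent);
        double sigma = 0.5;                    // global step size

        for (int t = 0; t < 10000; t++) {
            // Offspring: Gaussian mutation of the parent.
            std::vector<double> child(dim);
            for (int i = 0; i < dim; i++) child[i] = parent[i] + sigma * gauss(rng);
            double fChild = sphere(child);

            // (1+1) selection with the 1/5 success rule: enlarge sigma on
            // success, shrink it on failure, so that about one in five
            // mutations succeeds in equilibrium.
            if (fChild <= fParent) {
                parent = child;
                fParent = fChild;
                sigma *= 1.5;
            } else {
                sigma *= std::pow(1.5, -0.25);
            }
        }
        std::printf("best f = %g, final sigma = %g\n", fParent, sigma);
        return 0;
    }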

We offer theoretical and practical advice in machine learning and computational intelligence to other research groups and industrial partners.

Software: Shark

We provide and maintain Shark, a fast, modular, open-source C++ library for the design and optimization of adaptive systems. It offers a diverse set of machine learning algorithms as well as methods for nonlinear optimization. The library runs under SunOS/Linux, Mac OS X, and Windows, and can be downloaded from shark-ml.org.
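
To illustrate the library's interface, below is a minimal sketch of training a C-SVM with a Gaussian kernel, assuming the Shark 3 API (ClassificationDataset, GaussianRbfKernel, CSvmTrainer, KernelClassifier, ZeroOneLoss); the data file, kernel bandwidth, and regularization parameter are placeholders.

    #include <iostream>
    #include <shark/Data/Csv.h>                             // importCSV
    #include <shark/Models/Kernels/GaussianRbfKernel.h>     // RBF kernel
    #include <shark/Algorithms/Trainers/CSvmTrainer.h>      // C-SVM training
    #include <shark/ObjectiveFunctions/Loss/ZeroOneLoss.h>  // 0-1 loss

    using namespace shark;

    int main() {
        // Load a labeled dataset; file name and label position are placeholders.
        ClassificationDataset data;
        importCSV(data, "data.csv", LAST_COLUMN);

        // Gaussian RBF kernel with illustrative bandwidth gamma = 0.5.
        GaussianRbfKernel<> kernel(0.5);

        // Train a C-SVM with regularization parameter C = 1.0 and a bias term.
        KernelClassifier<RealVector> model;
        CSvmTrainer<RealVector> trainer(&kernel, 1.0, true);
        trainer.train(model, data);

        // Report the training error as a simple sanity check.
        ZeroOneLoss<unsigned int> loss;
        std::cout << "training error: "
                  << loss.eval(data.labels(), model(data.inputs())) << std::endl;
        return 0;
    }

On held-out data the same ZeroOneLoss call yields the test error; model selection (e.g., over the kernel bandwidth and C) is typically wrapped around this training step.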

Publications

Glasmachers, T. (2017). A Fast Incremental BSP Tree Archive for Non-dominated Points. In Evolutionary Multi-Criterion Optimization (EMO). Springer.
Irmer, T., Glasmachers, T., & Maji, S. (2017). Texture attribute synthesis and transfer using feed-forward CNNs. In Winter Conference on Applications of Computer Vision (WACV). IEEE.
Krause, O., Glasmachers, T., & Igel, C. (2017). Qualitative and Quantitative Assessment of Step Size Adaptation Rules. In Conference on Foundations of Genetic Algorithms (FOGA). ACM.
Doğan, Ü., Glasmachers, T., & Igel, C. (2016). A Unified View on Multi-class Support Vector Classification. Journal of Machine Learning Research, 17(45), 1–32.
Glasmachers, T. (2016). Finite Sum Acceleration vs. Adaptive Learning Rates for the Training of Kernel Machines on a Budget. In NIPS workshop on Optimization for Machine Learning.
Glasmachers, T. (2016). Small Stochastic Average Gradient Steps. In NIPS workshop on Optimizing the Optimizers.
Horn, D., Demircioğlu, A., Bischl, B., Glasmachers, T., & Weihs, C. (2016). A Comparative Study on Large Scale Kernelized Support Vector Machines. Advances in Data Analysis and Classification (ADAC), 1–17.
Krause, O., Glasmachers, T., Hansen, N., & Igel, C. (2016). Unbounded Population MO-CMA-ES for the Bi-Objective BBOB Test Suite. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO).
Krause, O., Glasmachers, T., & Igel, C. (2016). Multi-objective Optimization with Unbounded Solution Sets. In NIPS workshop on Bayesian Optimization.
Loshchilov, I., & Glasmachers, T. (2016). Anytime Bi-Objective Optimization with a Hybrid Multi-Objective CMA-ES (HMO-CMA-ES). In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO).
Weihs, C., & Glasmachers, T. (2016). Supervised Classification. In C. Weihs, Jannach, D., Vatolkin, I., & Rudolph, G. (Eds.), Music Data Analysis: Foundations and Applications.
Krause, O., & Glasmachers, T. (2015). A CMA-ES with Multiplicative Covariance Matrix Updates. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO).
Danafar, S., Rancoita, P. M. V., Glasmachers, T., Whittingstall, K., & Schmidhuber, J. (2014). Testing Hypotheses by Regularized Maximum Mean Discrepancy. International Journal of Computer and Information Technology (IJCIT), 3(2).
Glasmachers, T. (2014). Handling Sharp Ridges with Local Supremum Transformations. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO).
Glasmachers, T. (2014). Optimized Approximation Sets for Low-dimensional Benchmark Pareto Fronts. In Parallel Problem Solving from Nature (PPSN). Springer.
Glasmachers, T., Naujoks, B., & Rudolph, G. (2014). Start Small, Grow Big - Saving Multiobjective Function Evaluations. In Parallel Problem Solving from Nature (PPSN). Springer.
Wierstra, D., Schaul, T., Glasmachers, T., Sun, Y., Peters, J., & Schmidhuber, J. (2014). Natural Evolution Strategies. Journal of Machine Learning Research, 15, 949–980.
Glasmachers, T. (2013). A Natural Evolution Strategy with Asynchronous Strategy Updates. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO).
Glasmachers, T. (2013). The Planning-ahead SMO Algorithm. arXiv preprint arXiv:1305.0423v1.
Glasmachers, T., & Doğan, Ü. (2013). Accelerated Coordinate Descent with Adaptive Coordinate Frequencies. In Proceedings of the fifth Asian Conference on Machine Learning (ACML).
Krause, O., Fischer, A., Glasmachers, T., & Igel, C. (2013). Approximation properties of DBNs with binary hidden units and real-valued visible units. In Proceedings of the International Conference on Machine Learning (ICML).
Doğan, Ü., Glasmachers, T., & Igel, C. (2012). Turning Binary Large-margin Bounds into Multi-class Bounds. In ICML workshop on RKHS and kernel-based methods.
Doğan, Ü., Glasmachers, T., & Igel, C. (2012). A Note on Extending Generalization Bounds for Binary Large-margin Classifiers to Multiple Classes. In Proceedings of the European Conference on Machine Learning and Knowledge Discovery in Databases (ECML-PKDD).
Glasmachers, T. (2012). Convergence of the IGO-Flow of Isotropic Gaussian Distributions on Convex Quadratic Problems. In C. C. Coello, Cutello, V., Deb, K., Forrest, S., Nicosia, G., & Pavone, M. (Eds.), Parallel Problem Solving from Nature (PPSN). Springer.
Glasmachers, T., Koutník, J., & Schmidhuber, J. (2012). Kernel Representations for Evolving Continuous Functions. Journal of Evolutionary Intelligence, 5(3), 171–187. http://doi.org/10.1007/s12065-012-0070-y
Cuccu, G., Gomez, F., & Glasmachers, T. (2011). Novelty Restarts for Evolution Strategies. In Proceedings of the IEEE Congress on Evolutionary Computation (CEC). IEEE.
Doğan, Ü., Glasmachers, T., & Igel, C. (2011). Fast Training of Multi-Class Support Vector Machines (No. 2011/3). Department of Computer Science, University of Copenhagen.
Glasmachers, T., & Schmidhuber, J. (2011). Optimal Direct Policy Search. In Proceedings of the 4th Conference on Artificial General Intelligence (AGI).
Graziano, V., Glasmachers, T., Schaul, T., Pape, L., Cuccu, G., Leitner, J., & Schmidhuber, J. (2011). Artificial Curiosity for Autonomous Space Exploration. Acta Futura.
Schaul, T., Glasmachers, T., & Schmidhuber, J. (2011). High Dimensions and Heavy Tails for Natural Evolution Strategies. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO).
Schaul, T., Pape, L., Glasmachers, T., Graziano, V., & Schmidhuber, J. (2011). Coherence Progress: A Measure of Interestingness Based on Fixed Compressors. In Proceedings of the 4th Conference on Artificial General Intelligence (AGI).
Glasmachers, T. (2010). Universal Consistency of Multi-Class Support Vector Classification. In Advances in Neural Information Processing Systems (NIPS).
Glasmachers, T., & Igel, C. (2010). Maximum Likelihood Model Selection for 1-Norm Soft Margin SVMs with Multiple Parameters. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(8), 1522–1528.
Glasmachers, T., Schaul, T., & Schmidhuber, J. (2010). A Natural Evolution Strategy for Multi-Objective Optimization. In Parallel Problem Solving from Nature (PPSN). Springer.
Glasmachers, T., Schaul, T., Sun, Y., Wierstra, D., & Schmidhuber, J. (2010). Exponential Natural Evolution Strategies. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO).
Sun, Y., Glasmachers, T., Schaul, T., & Schmidhuber, J. (2010). Frontier Search. In Proceedings of the 3rd Conference on Artificial General Intelligence (AGI).
Glasmachers, T. (2008). On related violating pairs for working set selection in SMO algorithms. In M. Verleysen (Ed.), Proceedings of the 16th European Symposium on Artificial Neural Networks (ESANN). d-side publications.
Glasmachers, T. (2008). Gradient Based Optimization of Support Vector Machines. Doctoral thesis, Fakultät für Mathematik, Ruhr-Universität Bochum, Germany.
Glasmachers, T., & Igel, C. (2008). Second-Order SMO Improves SVM Online and Active Learning. Neural Computation, 20(2), 374–382.
Glasmachers, T., & Igel, C. (2008). Uncertainty Handling in Model Selection for Support Vector Machines. In G. Rudolph, Jansen, T., Lucas, S., Poloni, C., & Beume, N. (Eds.), Parallel Problem Solving from Nature (PPSN) (pp. 185–194). Springer.
Igel, C., Heidrich-Meisner, V., & Glasmachers, T. (2008). Shark. Journal of Machine Learning Research, 9, 993–996.
Igel, C., Glasmachers, T., Mersch, B., Pfeifer, N., & Meinicke, P. (2007). Gradient-Based Optimization of Kernel-Target Alignment for Sequence Kernels Applied to Bacterial Gene Start Detection. IEEE/ACM Transactions on Computational Biology and Bioinformatics (TCBB), 4(2), 216–226.
Mersch, B., Glasmachers, T., Meinicke, P., & Igel, C. (2007). Evolutionary Optimization of Sequence Kernels for Detection of Bacterial Gene Starts. International Journal of Neural Systems, 17(5), 369–381.
Glasmachers, T. (2006). Degeneracy in Model Selection for SVMs with Radial Gaussian Kernel. In M. Verleysen (Ed.), Proceedings of the 14th European Symposium on Artificial Neural Networks (ESANN). d-side publications.
Glasmachers, T., & Igel, C. (2006). Maximum-Gain Working Set Selection for Support Vector Machines. Journal of Machine Learning Research, 7, 1437–1466.
Mersch, B., Glasmachers, T., Meinicke, P., & Igel, C. (2006). Evolutionary Optimization of Sequence Kernels for Detection of Bacterial Gene Starts. In Proceedings of the 16th International Conference on Artificial Neural Networks (ICANN). Springer-Verlag.
Glasmachers, T., & Igel, C. (2005). Gradient-based Adaptation of General Gaussian Kernels. Neural Computation, 17(10), 2099–2105.