Theory of Machine Learning

Creating Autonomous Agents
A long-standing goal of reinforcement learning is to create truly autonomous systems, e.g., in robotics and in virtual environments. We are working towards this vision. To this end, we aim to equip deep reinforcement learning systems with additional structure, e.g., for efficient navigation and for object-oriented actions.
Machine Learning Applications and Transfer
We have several ongoing projects that aim to transfer machine learning as a technology into different application areas, in academia as well as in industry. In these activities we keep an eye on problems of general interest, such as transfer learning, automated machine learning, long-term maintainability, and human factors.
Optimization
Optimization underlies most of machine learning: it is used not only for training models, but also for tuning hyperparameters and for automated model selection. Complementing gradient-based methods, we pursue research in the area of evolutionary optimization, where we are interested in algorithm design and provable performance guarantees.
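As a purely illustrative sketch (not code from our group), the flavor of evolution strategy analyzed in several of the publications below — a (1+1)-ES with the classic 1/5 success rule for step-size adaptation — can be written in a few lines. All names and parameter values here are assumptions chosen for the example:

```python
import math
import random

def one_plus_one_es(f, x, sigma=1.0, iters=500):
    """Minimize f with a (1+1)-ES using 1/5 success rule step-size adaptation."""
    fx = f(x)
    for _ in range(iters):
        # Sample one offspring from an isotropic Gaussian around the parent.
        y = [xi + sigma * random.gauss(0.0, 1.0) for xi in x]
        fy = f(y)
        if fy <= fx:
            x, fx = y, fy            # success: accept offspring, enlarge step
            sigma *= math.exp(0.8)
        else:
            sigma *= math.exp(-0.2)  # failure: shrink step
        # The 4:1 ratio of the two factors equilibrates at ~1/5 success rate.
    return x, fx

# Usage: minimize the 5-dimensional sphere function.
random.seed(0)
best, value = one_plus_one_es(lambda v: sum(t * t for t in v), [5.0] * 5)
```

The asymmetric update factors implement the 1/5 rule: the step size is stationary exactly when about one in five offspring succeeds, which balances exploration against progress on smooth problems.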
2021

Fidencio, A., Naro, D., & Glasmachers, T. (2021). Application of Reinforcement Learning to a Mining System. In 19th IEEE World Symposium on Applied Machine Intelligence and Informatics (SAMI 2021).

2020
Ali, O., Saif-ur-Rehman, M., Dyck, S., Glasmachers, T., Iossifidis, I., & Klaes, C. (2020). Improving the performance of EEG decoding using anchored-STFT in conjunction with gradient norm adversarial augmentation (No. 2011.14694). arXiv.org.

Glasmachers, T., & Krause, O. (2020). The Hessian Estimation Evolution Strategy. In Parallel Problem Solving from Nature (PPSN XVII). Springer.

Glasmachers, T., & Krause, O. (2020). Convergence Analysis of the Hessian Estimation Evolution Strategy (No. 2009.02732). arXiv.org.

Hlynsson, H. D., Schüler, M., Schiewer, R., Glasmachers, T., & Wiskott, L. (2020). Latent Representation Prediction Networks. arXiv preprint arXiv:2009.09439.

Müller, N., & Glasmachers, T. (2020). Non-local Optimization: Imposing Structure on Optimization Problems by Relaxation (No. 2011.06064). arXiv.org.

Oller, D., Cuccu, G., & Glasmachers, T. (2020). Analyzing Reinforcement Learning Benchmarks with Random Weight Guessing. In International Conference on Autonomous Agents and Multi-Agent Systems.

Saif-ur-Rehman, M., Ali, O., Dyck, S., Lienkämper, R., Metzler, M., Parpaley, Y., et al. (2020). SpikeDeep-Classifier: A deep-learning based fully automatic offline spike sorting algorithm. Journal of Neural Engineering.

Tomašev, N., Cornebise, J., Hutter, F., Picciariello, A., Connelly, B., Belgrave, D. C. M., et al. (2020). AI for Social Good: Unlocking the Opportunity for Positive Impact. Nature Communications, (2468).

2019
Altartouri, H., & Glasmachers, T. (2019). Moment Vector Encoding of Protein Sequences for Supervised Classification. In Practical Applications of Computational Biology and Bioinformatics, 13th International Conference (pp. 25–35). Springer International Publishing. http://doi.org/10.1007/978-3-030-23873-5_4

Glasmachers, T. (2019). Challenges of Convex Quadratic Bi-objective Benchmark Problems. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO) (pp. 559–567). ACM.

Glasmachers, T. (2019). Global Convergence of the (1+1) Evolution Strategy. Evolutionary Computation Journal (ECJ).

Hakenes, S., & Glasmachers, T. (2019). Boosting Reinforcement Learning with Unsupervised Feature Extraction. In Artificial Neural Networks and Machine Learning – ICANN 2019: Theoretical Neural Computation (pp. 555–566). Springer International Publishing. http://doi.org/10.1007/978-3-030-30487-4_43

Nafzi, M., Brauckmann, M., & Glasmachers, T. (2019). Vehicle Shape and Color Classification Using Convolutional Neural Network (No. 1905.08612). arXiv.org.

Qaadan, S., Schüler, M., & Glasmachers, T. (2019). Dual SVM Training on a Budget. In Proceedings of the 8th International Conference on Pattern Recognition Applications and Methods. SCITEPRESS – Science and Technology Publications.

Reimann, D., Nidadavolu, K., ul Hassan, H., Vajragupta, N., Glasmachers, T., Junker, P., & Hartmaier, A. (2019). Modeling Macroscopic Material Behavior With Machine Learning Algorithms Trained by Micromechanical Simulations. Frontiers in Materials, 6, 181. http://doi.org/10.3389/fmats.2019.00181

Saif-ur-Rehman, M., Lienkämper, R., Parpaley, Y., Wellmer, J., Liu, C., Lee, B., et al. (2019). SpikeDeeptector: A deep-learning based method for detection of neural spiking activity. Journal of Neural Engineering.

2018
Akimoto, Y., Auger, A., & Glasmachers, T. (2018). Drift Theory in Continuous Search Spaces: Expected Hitting Time of the (1+1)-ES with 1/5 Success Rule. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO). ACM.

Glasmachers, T., & Qaadan, S. (2018). Speeding Up Budgeted Dual SVM Training with Precomputed GSS. In R. Vera-Rodriguez, S. Velastin, & A. Morales (Eds.), The 23rd Iberoamerican Congress on Pattern Recognition.

Glasmachers, T., & Qaadan, S. (2018). Speeding Up Budgeted Stochastic Gradient Descent SVM Training with Precomputed Golden Section Search. In G. Nicosia, P. Pardalos, G. Giuffrida, R. Umeton, & V. Sciacca (Eds.), The 4th International Conference on Machine Learning, Optimization and Data Science – LOD 2018.

Loshchilov, I., Glasmachers, T., & Beyer, H.-G. (2018). Large Scale Black-box Optimization by Limited-Memory Matrix Adaptation. IEEE Transactions on Evolutionary Computation, 99.

Müller, N., & Glasmachers, T. (2018). Challenges in High-dimensional Controller Design with Evolution Strategies. In Parallel Problem Solving from Nature (PPSN XVI). Springer.

Qaadan, S., & Glasmachers, T. (2018). Multi-Merge Budget Maintenance for Stochastic Gradient Descent SVM Training. 13th WiML Workshop, co-located with NeurIPS, Montreal, QC, Canada.

Qaadan, S., & Glasmachers, T. (2018). Multi-Merge Budget Maintenance for Stochastic Gradient Descent SVM Training (No. arXiv:1806.10179). arXiv.org.

Qaadan, S., & Glasmachers, T. (2018). Multi-Merge Budget Maintenance for Stochastic Coordinate Ascent SVM Training. Artificial Intelligence International Conference – A2IC 2018.

Qureshi, H. S., Glasmachers, T., & Wiczorek, R. (2018). User-Centered Development of a Pedestrian Assistance System Using End-to-End Learning. In 17th IEEE International Conference on Machine Learning and Applications (ICMLA) (pp. 808–813). IEEE.

2017
Glasmachers, T. (2017). A Fast Incremental BSP Tree Archive for Non-dominated Points. In Evolutionary Multi-Criterion Optimization (EMO). Springer.

Glasmachers, T. (2017). Limits of End-to-End Learning. In Proceedings of the 9th Asian Conference on Machine Learning (ACML).

Irmer, T., Glasmachers, T., & Maji, S. (2017). Texture attribute synthesis and transfer using feed-forward CNNs. In Winter Conference on Applications of Computer Vision (WACV). IEEE.

Krause, O., Glasmachers, T., & Igel, C. (2017). Qualitative and Quantitative Assessment of Step Size Adaptation Rules. In Conference on Foundations of Genetic Algorithms (FOGA). ACM.

2016
Demircioğlu, A., Horn, D., Glasmachers, T., Bischl, B., & Weihs, C. (2016). Fast model selection by limiting SVM training times (No. 1602.03368v1). arXiv.org.

Doğan, Ü., Glasmachers, T., & Igel, C. (2016). A Unified View on Multi-class Support Vector Classification. Journal of Machine Learning Research, 17(45), 1–32.

Glasmachers, T. (2016). Finite Sum Acceleration vs. Adaptive Learning Rates for the Training of Kernel Machines on a Budget. In NIPS Workshop on Optimization for Machine Learning.

Small Stochastic Average Gradient Steps