My current research interests are:
- scalable and energy efficient machine learning
- understanding and developing algorithms for continual learning
- creating algorithms uniquely suited for recurrent neural networks
Much of my work draws inspiration from neuroscience and biology in the quest to build better and more general artificial intelligence.
Selected Publications
Subramoney, A., Nazeer, K. K., Schöne, M., Mayr, C., & Kappel, D. (2023). Efficient Recurrent Architectures through Activity Sparsity and Sparse Back-Propagation through Time. In International Conference on Learning Representations. Retrieved from https://openreview.net/forum?id=lJdOlWg8td

Subramoney, A. (2023). Efficient Real Time Recurrent Learning through combined activity and parameter sparsity.
Yegenoglu, A., Subramoney, A., Hater, T., Jimenez-Romero, C., Klijn, W., Pérez Martín, A., et al. (2022). Exploring parameter and hyper-parameter spaces of neuroscience models on high performance computers with Learning to Learn. Frontiers in Computational Neuroscience, 46. Retrieved from https://doi.org/10.3389/fncom.2022.885207
Teaching
- Winter Term 2022/2023: Seminar "Topics in Deep Learning for Sequence Processing"
- Winter Term 2021/2022: Seminar "Topics in Deep Learning for Sequence Processing"
- Summer Term 2022: Lecture "Introduction to Artificial Intelligence"
- Summer Term 2021: Lab course "Introduction to Python"
Supervised Theses
Hark, N. (2022, May). Memory Modules for Deep Learning. Master’s thesis, Institute of Neural Computation, Ruhr University Bochum, Bochum, Germany.
The Institut für Neuroinformatik (INI) is a central research unit of the Ruhr-Universität Bochum. We aim to understand the fundamental principles through which organisms generate behavior and cognition, linked to their environments through sensory systems and acting on those environments through effector systems. Inspired by our insights into such natural cognitive systems, we seek new solutions to problems of information processing in artificial cognitive systems. We draw on a variety of disciplines, including experimental approaches from psychology and neurophysiology as well as theoretical approaches from physics, mathematics, electrical engineering and applied computer science, in particular machine learning, artificial intelligence, and computer vision.
Universitätsstr. 150, Building NB, Room 3/32
D-44801 Bochum, Germany
Tel: (+49) 234 32-28967
Fax: (+49) 234 32-14210