- Dr. Anand Subramoney
My current research interests are:
- scalable and energy-efficient machine learning
- understanding and developing algorithms for continual learning
- creating algorithms uniquely suited for recurrent neural networks
Much of my work draws inspiration from neuroscience and biology in the quest to build better and more general artificial intelligence.
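To make the scalability and energy-efficiency theme concrete: in activity-sparse recurrent networks, a unit communicates with the rest of the network only when its internal state crosses a threshold, so silent units cost no recurrent computation. Below is a minimal sketch of that mechanism in plain NumPy; the decay factor, threshold, and soft-reset rule are illustrative assumptions, not the formulation from any of the publications listed on this page.

```python
# Minimal sketch of activity-sparse recurrent computation. Illustrative
# assumptions throughout: the decay factor, threshold, and soft reset are
# NOT taken from a specific published model.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden = 8, 32
W_in = rng.normal(0.0, 0.3, (n_hidden, n_in))       # input weights
W_rec = rng.normal(0.0, 0.3, (n_hidden, n_hidden))  # recurrent weights
threshold = 1.0                                     # firing threshold (assumed)

state = np.zeros(n_hidden)
for _ in range(100):
    x = rng.normal(0.0, 1.0, n_in)                  # stand-in input for this step
    active = state > threshold                      # which units fire this step
    events = np.where(active, state, 0.0)           # only active units transmit
    state = np.where(active, state - threshold, state)  # soft reset (assumed)
    # Inactive units transmit exact zeros, so W_rec @ events only needs the
    # columns of W_rec belonging to active units -- the source of the
    # potential compute and energy savings.
    state = 0.9 * state + W_in @ x + W_rec @ events

print(f"fraction of silent units at the last step: {1.0 - active.mean():.2f}")
```

Note that the dense NumPy product above only makes the mechanism visible; the savings materialize in practice only when the implementation actually skips the inactive columns, for example via sparse kernels or event-driven hardware.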
Selected Publications
- Grappolini, E. W., & Subramoney, A. (2023). Beyond Weights: Deep learning in Spiking Neural Networks with pure synaptic-delay training. In International Conference on Neuromorphic Systems (ICONS ′23), Santa Fe, NM, USA. ACM.
- Subramoney, A., Nazeer, K. K., Schöne, M., Mayr, C., & Kappel, D. (2023). Efficient Recurrent Architectures through Activity Sparsity and Sparse Back-Propagation through Time. In International Conference on Learning Representations. https://openreview.net/forum?id=lJdOlWg8td
- Subramoney, A. (2023). Efficient Real Time Recurrent Learning through Combined Activity and Parameter Sparsity. In ICLR 2023 Workshop on Sparse Neural Networks. https://doi.org/10.48550/arXiv.2303.05641
Publications

2024

- Schiewer, R., Subramoney, A., & Wiskott, L. (2024). Exploring the limits of hierarchical world models in reinforcement learning. Scientific Reports, 14(1). https://doi.org/10.1038/s41598-024-76719-w

2023

- Grappolini, E. W., & Subramoney, A. (2023). Beyond Weights: Deep learning in Spiking Neural Networks with pure synaptic-delay training. In International Conference on Neuromorphic Systems (ICONS ′23), Santa Fe, NM, USA. ACM.
- Subramoney, A., Nazeer, K. K., Schöne, M., Mayr, C., & Kappel, D. (2023). Efficient Recurrent Architectures through Activity Sparsity and Sparse Back-Propagation through Time. In International Conference on Learning Representations. https://openreview.net/forum?id=lJdOlWg8td
- Subramoney, A. (2023). Efficient Real Time Recurrent Learning through Combined Activity and Parameter Sparsity. In ICLR 2023 Workshop on Sparse Neural Networks. https://doi.org/10.48550/arXiv.2303.05641

2022

- Yegenoglu, A., Subramoney, A., Hater, T., Jimenez-Romero, C., Klijn, W., Pérez Martín, A., et al. (2022). Exploring parameter and hyper-parameter spaces of neuroscience models on high performance computers with Learning to Learn. Frontiers in Computational Neuroscience, 46. https://doi.org/10.3389/fncom.2022.885207

Teaching

- Winter Term 2022/2023: Seminar "Topics in Deep Learning for Sequence Processing"
- Winter Term 2021/2022: Seminar "Topics in Deep Learning for Sequence Processing"
- Summer Term 2022: Lecture "Introduction to Artificial Intelligence"
- Summer Term 2021: Lab course "Introduction to Python"
Theses

2022

- Hark, N. (2022, May). Memory Modules for Deep Learning. Master’s thesis, Institute of Neural Computation, Ruhr University Bochum, Bochum, Germany.
The Institut für Neuroinformatik (INI) is a research unit of the Faculty of Computer Science at the Ruhr-Universität Bochum. Its scientific goal is to understand the fundamental principles through which organisms generate behavior and cognition while linked to their environments through sensory and effector systems. Inspired by our insights into such natural cognitive systems, we seek new solutions to problems of information processing in artificial cognitive systems. We draw from a variety of disciplines that include experimental psychology and neurophysiology as well as machine learning, neural artificial intelligence, computer vision, and robotics.
Universitätsstr. 150, Building NB, Room 3/32
D-44801 Bochum, Germany
Tel: (+49) 234 32-28967
Fax: (+49) 234 32-14210