Sustainable Machine Learning

Modern machine learning (ML) architectures consume unprecedented amounts of energy; a single training run can exceed the energy and carbon footprint of a car over its entire lifetime. If the current growth rate continues, ML models may overtake the transport sector in the global energy balance within 10-20 years. Biological brains, by contrast, are extremely energy efficient. The Sustainable Machine Learning group identifies the mechanisms that enable this striking energy efficiency of biological brains and explores new approaches to significantly reduce the energy footprint of machine learning using hybrid ML/bio-inspired models.

Our mission

Our mission is to develop state-of-the-art machine learning models that scale to real-world problems while being energy-efficient. These models are based on core design principles of sparsity and asynchrony and take inspiration from biology/neuroscience.

Research projects
  • EVENTS (Energy-efficient distributed sensor systems for machine vision: event-based distributed AI algorithms) started in October 2022 as part of the funding program "BMBF - OCTOPUS - Electronic systems for trustworthy and energy-efficient decentralised data processing in edge computing". The grant aims to develop efficient general-purpose AI algorithms that can be adapted for deployment on energy-efficient neuromorphic systems for computer vision. The project consortium, led by TU Dresden, will implement and test the algorithms developed at INI, RUB, on innovative neuromorphic hardware in various pilot applications. For more information, see the EVENTS project website.
  • ESCADE (Energy-Efficient Large-Scale Artificial Intelligence for Sustainable Data Centers) started in May 2023 and is funded by the Bundesministerium für Wirtschaft und Klimaschutz (BMWK). The aim of the project is to develop state-of-the-art large, distributed, and energy-efficient machine learning models for complex applications such as natural language processing. For more information, see the ESCADE project website.
      

Publications

2023

  • Tunable synaptic working memory with volatile memristive devices
    Ricci, S., Kappel, D., Tetzlaff, C., Ielmini, D., & Covi, E.
    Neuromorphic Computing and Engineering, 3(4), 044004
  • CoBeL-RL: A neuroscience-oriented simulation framework for complex behavior and learning
    Diekmann, N., Vijayabaskaran, S., Zeng, X., Kappel, D., Menezes, M. C., & Cheng, S.
    Frontiers in Neuroinformatics, 17

Activity and parameter sparsity in recurrent networks

Recent advances in machine learning have demonstrated impressive performance on complex tasks such as human-level image understanding and natural language processing. However, the increase in the size and performance of these models has been accompanied by an increase in their energy consumption. This development has led to growing interest in sparse, energy-efficient models in recent years. In this project, we will investigate activity and parameter sparsity in a recurrent neural network architecture. The interplay between these two types of sparsity will be studied, and optimal trade-offs between performance and efficiency will be identified.
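
A minimal sketch of how these two forms of sparsity can be combined is given below, assuming PyTorch purely for illustration; the layer sizes, weight density, and number of active units are hypothetical placeholders, not the settings studied in the project. Parameter sparsity is imposed by a fixed binary mask on the recurrent weights, and activity sparsity by keeping only the k most active hidden units at every time step.

    import torch
    import torch.nn as nn


    class SparseRNNCell(nn.Module):
        """Toy recurrent cell combining parameter sparsity (a fixed binary mask
        prunes most recurrent weights) with activity sparsity (only the k most
        active hidden units are kept at each time step)."""

        def __init__(self, input_size, hidden_size, weight_density=0.1, active_units=16):
            super().__init__()
            self.w_in = nn.Linear(input_size, hidden_size)
            self.w_rec = nn.Parameter(0.1 * torch.randn(hidden_size, hidden_size))
            # Fixed random pruning mask: keep roughly `weight_density` of the recurrent weights.
            self.register_buffer("mask", (torch.rand(hidden_size, hidden_size) < weight_density).float())
            self.active_units = active_units

        def forward(self, x, h):
            # Masked recurrent weights -> parameter sparsity.
            pre = self.w_in(x) + nn.functional.linear(h, self.w_rec * self.mask)
            # Keep only the k most active units -> activity sparsity.
            top = torch.topk(pre, self.active_units, dim=-1)
            return torch.zeros_like(pre).scatter_(-1, top.indices, torch.relu(top.values))


    # Usage: run the cell over a dummy sequence of 20 time steps.
    cell = SparseRNNCell(input_size=32, hidden_size=128)
    h = torch.zeros(1, 128)
    for x in torch.randn(20, 1, 32):
        h = cell(x, h)
    print("nonzero hidden units in the last step:", int((h != 0).sum()))

Because most weights and most activations are exactly zero at every step, a model of this form can in principle skip the corresponding memory accesses and multiply-accumulate operations on sparsity-aware or neuromorphic hardware; how much of that potential translates into measured savings is part of the performance/efficiency trade-off studied here.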

Efficient transformer networks for video object detection

In recent years, software products such as ChatGPT and DALL·E have demonstrated a new quality of automated data processing based on machine learning. These models are built on deep transformer networks, which are at the forefront of today's machine learning research and show state-of-the-art performance in virtually every relevant task. However, the high resource and energy consumption of these models has been an obstacle to their widespread adoption. In this project, we will investigate approaches that exploit sparsity in transformer networks to make them more resource-efficient.
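
As a rough illustration of one way sparsity can enter a transformer, the sketch below implements a top-k sparse attention step in which each query attends only to its highest-scoring keys. It assumes PyTorch, and the function name, the keep parameter, and the tensor dimensions are illustrative assumptions rather than the project's actual method.

    import torch


    def topk_sparse_attention(q, k, v, keep=8):
        """Single-head attention in which each query attends only to its
        `keep` highest-scoring keys; all other attention weights are zero.
        q, k, v have shape (batch, seq_len, dim)."""
        scores = q @ k.transpose(-2, -1) / k.shape[-1] ** 0.5  # (batch, seq, seq)
        top = torch.topk(scores, keep, dim=-1)
        # Push all non-top-k scores to -inf so softmax assigns them zero weight.
        sparse_scores = torch.full_like(scores, float("-inf"))
        sparse_scores.scatter_(-1, top.indices, top.values)
        attn = torch.softmax(sparse_scores, dim=-1)
        return attn @ v


    # Usage on a dummy batch of 2 sequences with 64 tokens of dimension 32.
    x = torch.randn(2, 64, 32)
    out = topk_sparse_attention(x, x, x, keep=8)
    print(out.shape)  # torch.Size([2, 64, 32])

In dense attention every query attends to every key, so compute and memory grow quadratically with sequence length; restricting each query to a small, fixed number of keys is one common way to cut that cost, at the price of an approximation whose effect on accuracy has to be evaluated per task.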

The Institut für Neuroinformatik (INI) is a central research unit of the Ruhr-Universität Bochum. We aim to understand the fundamental principles through which organisms generate behavior and cognition while being linked to their environments through sensory systems and acting in those environments through effector systems. Inspired by our insights into such natural cognitive systems, we seek new solutions to problems of information processing in artificial cognitive systems. We draw from a variety of disciplines, including experimental approaches from psychology and neurophysiology as well as theoretical approaches from physics, mathematics, electrical engineering, and applied computer science, in particular machine learning, artificial intelligence, and computer vision.

Universitätsstr. 150, Building NB, Room 3/32
D-44801 Bochum, Germany

Tel: (+49) 234 32-28967
Fax: (+49) 234 32-14210