Sep 3 – 4, 2025
Hörsaalgebäude, Campus Poppelsdorf, Universität Bonn
Europe/Berlin timezone

Load Balancing Neurons: Controlling Firing Rates Improves Plasticity in Continual Learning

Not scheduled
1h 30m
Open Space (first floor)

Poster Hybrid ML Poster Session

Speaker

Jan Robine (TU Dortmund)

Description

Neural networks often suffer from plasticity loss, which limits their ability to adapt to evolving data distributions in continual learning settings. This results in degraded performance, poor generalization, and inefficient use of model capacity. While recent methods mitigate this by resetting underutilized neurons based on utility scores, the underlying mechanisms remain poorly understood. In this work, we propose the firing rate as a simple, activation-independent metric to diagnose neuron inactivity and dysfunction, such as dead or overly linearized ReLUs. Building on this insight, we introduce a load balancing mechanism that dynamically adjusts neuron activation thresholds to maintain healthy firing rates. We further show that architectural techniques such as non-affine normalization and L2 regularization implicitly promote balanced activity and improve plasticity. Across two continual learning benchmarks, our methods lead to substantial improvements in test accuracy, surpassing both continual backpropagation and reset-based baselines.

Author

Jan Robine (TU Dortmund)

Presentation materials

There are no materials yet.