Neurorobotik
Secondary Matrices: Incremental Learning by Individual Learning Rates

Figure: Conceptual illustration of secondary matrices controlling neuron learning rates.

Neural networks are characterized by their (primary) matrices of synaptic connection weights (and sometimes intrinsic or bias terms for the neurons). In this student project we explore the practical use of secondary matrices: internal parameter (IP) values that assign an individual learning rate to each neuron. When the primary matrices are updated, the IP (a positive scalar) from the secondary matrix determines by how much each value changes. Some neurons and synapses are therefore stable: they respond little, if at all, to input and continue to contribute fixed values to the network. Others are highly flexible and adapt to input patterns. The IP values thus define the balance between stability and flexibility of the primary matrix values.

The IP values can be learned in several ways, e.g. locally, or globally based on error signals. We want to compare and evaluate different methods for setting the IP values in the secondary matrix. The goal is to implement efficient incremental learning; the database will contain basic (stable) and additional (variable) patterns.
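A minimal sketch of the idea in NumPy, assuming a single fully connected layer and a per-neuron (per-row) IP value; the shapes, the hypothetical `update` function, and the example IP values are illustrative assumptions, not part of the project specification:

```python
import numpy as np

rng = np.random.default_rng(0)

# Primary matrix: synaptic weights of one layer (assumed shape: neurons x inputs).
n_in, n_out = 4, 3
W = rng.normal(0.0, 0.5, size=(n_out, n_in))

# Secondary matrix: one positive internal parameter (IP) per neuron,
# used as that neuron's individual learning rate (hypothetical values).
ip = np.array([0.0, 0.05, 0.5])  # neuron 0 is stable, neuron 2 is flexible

def update(W, grad, ip):
    """Scale each neuron's (row's) weight update by its IP value."""
    return W - ip[:, None] * grad

# Toy gradient standing in for some error signal.
grad = rng.normal(size=W.shape)
W_new = update(W, grad, ip)

# Neuron 0 has IP = 0, so its weights stay fixed (a stable neuron).
assert np.allclose(W_new[0], W[0])
```

The same mechanism extends to whole secondary matrices (one IP per synapse rather than per neuron) by making `ip` the same shape as `W` and multiplying element-wise.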

Requirements (Master):

  • Experience in Computational Neuroscience and Machine Learning
 

Advisor: