I am a PhD student at Chalmers University of Technology. My research interests are biologically motivated learning algorithms and energy-based models.
A wide range of research on feedforward neural networks requires "bending" the chain rule during backpropagation. The package Bender.jl provides neural network layers, compatible with Flux.jl, that give users more freedom to customize every aspect of the forward mapping. This makes it easy to leverage ChainRules.jl to compose a wide range of experiments, such as training binary neural networks or using Feedback Alignment and Direct Feedback Alignment, in just a few lines of code.
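To illustrate the general idea of "bending" the chain rule, here is a minimal sketch using only plain Flux.jl and ChainRulesCore (it does not use Bender.jl's own layer API; the `binarize` function and the model below are purely illustrative). The forward pass uses a non-differentiable sign activation, and a custom `rrule` replaces its gradient with a straight-through estimator, which is the basic trick behind training binary neural networks:

```julia
using Flux
using ChainRulesCore

# Binarizing activation: the forward pass maps inputs to ±1 (or 0) via sign.
binarize(x::AbstractArray) = sign.(x)

# Custom reverse rule: pretend the forward map was the identity
# (straight-through estimator), so gradients pass through sign unchanged.
function ChainRulesCore.rrule(::typeof(binarize), x::AbstractArray)
    y = binarize(x)
    binarize_pullback(ȳ) = (NoTangent(), ȳ)
    return y, binarize_pullback
end

# Illustrative usage: a small model with binary activations,
# trained with the ordinary Flux machinery.
model = Chain(Dense(784 => 128), binarize, Dense(128 => 10))
x = rand(Float32, 784, 32)
y = Flux.onehotbatch(rand(0:9, 32), 0:9)
loss(m, x, y) = Flux.logitcrossentropy(m(x), y)
grads = Flux.gradient(m -> loss(m, x, y), model)
```

Bender.jl packages this kind of customization into reusable layer types so that experiments like the ones above, as well as Feedback Alignment and Direct Feedback Alignment, can be set up without writing the gradient rules by hand each time.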