Lux.jl: Explicit Parameterization of Neural Networks in Julia

07/28/2022, 1:00 PM — 1:10 PM UTC
Red

Abstract:

Julia already has quite a few well-established neural network frameworks -- Flux & Knet. However, certain design elements shared by these frameworks -- models coupled to their parameters, and reliance on internal mutation -- make them less compiler and user friendly. Making changes to address these problems within the respective frameworks would be too disruptive for users. To address these challenges, we designed Lux, a new NN framework (see the sketch below for the coupling in question).
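For concreteness, here is a minimal sketch of the coupling the abstract refers to, using Flux's standard Dense layer (the field names below are Flux's own, not anything introduced by this talk):

```julia
using Flux

# In Flux, the parameters are stored inside the layer object itself:
m = Dense(2 => 8, tanh)
m.weight   # the weight matrix lives in the model
m.bias     # and so does the bias vector

# Training then updates these arrays in place (internal mutation),
# which is part of what makes such models harder for compilers and
# some AD tools to reason about.
```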

Description:

Lux is a neural network framework built entirely out of pure functions, which makes it both compiler and automatic-differentiation friendly. Relying on the most straightforward pure-function API ensures there are no reference issues to debug, lets compilers optimize the code as much as possible, and keeps the framework compatible with Symbolics/XLA/etc. without any tricks.
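The following sketch shows the explicit-parameter calling convention this enables, based on Lux's documented setup/apply pattern (exact API details may differ across versions):

```julia
using Lux, Random

# The model is a pure, stateless description of the architecture.
model = Chain(Dense(2, 8, tanh), Dense(8, 1))

# Parameters (ps) and state (st) live outside the model object.
rng = Random.default_rng()
ps, st = Lux.setup(rng, model)

# Applying the model is a pure function: it takes the input, the
# parameters, and the state, and returns the output along with the
# (possibly updated) state -- nothing is mutated in place.
x = rand(rng, Float32, 2, 4)
y, st = model(x, ps, st)
```

Because ps and st are plain data structures passed in and out explicitly, the same model object can be reused with different parameters, and AD tools only ever see function inputs and outputs.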

Repository: https://github.com/avik-pal/ExplicitFluxLayers.jl/
