Optimization.jl seeks to bring together all of the optimization packages it can find, local and global, into one unified Julia interface. This means you learn one package, and you learn them all! Optimization.jl adds a few high-level features, such as integration with automatic differentiation, to make its usage fairly simple for most cases, while exposing all of the underlying options in a single unified interface.
Optimization.jl wraps most of the major optimization packages currently available in Julia, namely BlackBoxOptim, CMAEvolutionStrategy, Evolutionary, Flux, GCMAES, MultistartOptimization, Metaheuristics, NOMAD, NLopt, Nonconvex, Optim, and QuadDIRECT. Additionally, the integration with ModelingToolkit and MathOptInterface lets it leverage the state-of-the-art symbolic manipulation capabilities offered by these packages, specifically using them to construct the objective and constraint functions, along with their Jacobians and Hessians, efficiently. This talk will show how to use the Optimization.jl package and its various AD backends in combination with the various optimizer backends. The interface is broken into three components: OptimizationFunction, OptimizationProblem, and solve called on an OptimizationProblem. We will cover each of these components and discuss the pros and cons of the choices available for specific problems. We will also show how such a flexible system is necessary for scientific machine learning by demonstrating popular SciML models on real-world problems.
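As a minimal sketch of this three-component workflow (the Rosenbrock objective and the parameter values here are illustrative choices; it assumes the OptimizationOptimJL wrapper package for the Optim.jl backend and ForwardDiff for the AD backend):

```julia
using Optimization, OptimizationOptimJL, ForwardDiff

# Objective: the Rosenbrock function, with fixed parameters p = (a, b).
rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2

# Component 1: OptimizationFunction — attach an AD backend so gradients
# and Hessians are derived automatically from the objective.
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())

# Component 2: OptimizationProblem — pair the function with an initial
# guess and parameter values.
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])

# Component 3: solve — pick any wrapped optimizer, e.g. BFGS from Optim.jl.
sol = solve(prob, BFGS())
```

Because the problem definition is decoupled from the solver, switching to any other wrapped backend only means passing a different optimizer to solve, leaving the OptimizationFunction and OptimizationProblem untouched.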