Conference Agenda

Thursday, July 28, 2022

09:00 UTC

Keynote - Jeremy Howard

07/28/2022, 9:00 AM–9:45 AM UTC
Green

Keynote - Jeremy Howard

10:30 UTC

Quiqbox.jl: Basis set generator for electronic structure problem

07/28/2022, 10:30 AM–10:40 AM UTC
Green

Quiqbox.jl is a Julia package that allows highly customizable Gaussian-type basis set design for electronic structure problems in quantum chemistry and quantum physics. The package provides a variety of useful functions around basis set generation such as RHF and UHF methods, standalone 1-electron and 2-electron integrals, and most importantly, variational optimization for basis set parameters. It supports Linux, Mac OS, and Windows.

10:40 UTC

MathLink(Extras): The powers of Mathematica and Julia combined

07/28/2022, 10:40 AM–10:50 AM UTC
Green

Mathematica is a powerful tool for many purposes, but it can be cumbersome to work with. This is especially clear for more automated tasks. In this short talk, I will introduce MathLink and MathLinkExtras, which enable interoperability between Julia and Mathematica. I will introduce the basic syntax of MathLink and discuss an application of automated computation of nested integrals.

10:50 UTC

Dates with Nanoseconds

07/28/2022, 10:50 AM–11:00 AM UTC
Green

Julia's DateTime type is limited to milliseconds, while the Time type supports nanoseconds. This talk introduces NanoDates.jl and its NanoDate type, which works like DateTime but with higher precision. CompoundPeriods behave more smoothly and are available as an operational design element for developers.

11:00 UTC

Exploring audio circuits with ModelingToolkit.jl

07/28/2022, 11:00 AM–11:30 AM UTC
Green

The study of audio circuits is interdisciplinary. It combines DSP, analog circuits, differential equations, and semiconductor theory. Mathematical tools like Fourier Transforms and standard circuit analysis cannot explain the behavior of stateful nonlinearities. A complete description of a circuit can only be obtained through time-domain (or ‘transient’, in SPICE terms) simulation. ModelingToolkit.jl enables rapid design iteration and combines features that traditionally require multiple tools.

11:30 UTC

Universal Differential Equation models with wrong assumptions

07/28/2022, 11:30 AM–11:40 AM UTC
Green

The JuliaML ecosystem introduces an effective way to model natural phenomena with Universal Differential Equations. UDEs enrich differential equations by combining an explicitly known term with a term learned from data via a neural network. Here, we explore what happens when our assumptions about the known term are wrong, making use of the rich interoperability of Julia. The insight we offer will be useful to the Julia community in better understanding the strengths and possible shortcomings of UDEs.

11:40 UTC

Using SciML to predict the time evolution of a complex network.

07/28/2022, 11:40 AM–11:50 AM UTC
Green

Modeling the temporal evolution of complex networks is still an open challenge across many fields. Using the SciML ecosystem in Julia, we train and simplify a Neural ODE on the low-dimensional embeddings of a temporal sequence of networks. In this way, we discover a dynamical system representation of the network that allows us to predict its temporal evolution. In the talk we’ll show how the tight integration of SciML, Network, and Matrix Algebra packages in Julia opens new modeling directions.

12:30 UTC

Fast, Faster, Julia: High Performance Implementation of the NFFT

07/28/2022, 12:30 PM–1:00 PM UTC
Green

In this talk, we present the architecture of the NFFT.jl package, which implements the non-equidistant fast Fourier transform (NFFT). The NFFT is commonly implemented in C/C++ and requires sophisticated performance optimizations to exploit the full potential of the underlying algorithm. We demonstrate how Julia enables a high-performance, generic, and dimension-agnostic implementation with only a fraction of the code required for established C/C++ NFFT implementations.

12:30 UTC

Improvements in package precompilation

07/28/2022, 12:30 PM–1:00 PM UTC
Blue

Julia code can be precompiled to save time loading and/or compiling it on first execution. Precompilation is nuanced because Julia code comes in many flavors, including source text, lowered code, type-inferred code, and various stages of optimization to reach native machine code. We will summarize what has (and hasn't) previously been precompiled, some of the challenges posed by Julia's dynamism, the nature of some recent changes, and prospects for near-term extensions to precompilation.

12:30 UTC

Adaptive Radial Basis Function Surrogates in Julia

07/28/2022, 12:30 PM–1:00 PM UTC
Red

This talk focuses on an iterative algorithm, called active learning, to update radial basis function surrogates by adaptively choosing points across its input space. This work extensively uses the SciML ecosystem, and in particular, Surrogates.jl.

12:30 UTC

JuliaGPU

07/28/2022, 12:30 PM–1:15 PM UTC
BoF

The JuliaGPU community welcomes both long-standing contributors and newcomers to a birds-of-a-feather event on the state of the JuliaGPU ecosystem.

Join the discussion on the bof-voice channel in Discord. Voice your feedback and experiences.

13:00 UTC

Tricks.jl: abusing backedges for fun and profit

07/28/2022, 1:00 PM–1:10 PM UTC
Purple

Tricks.jl is a package that does cool tricks to do more work at compile time. It does this by generating (@generated) functions that just return "hardcoded" values, and then retriggering generation when (if) that value changes. This retriggering is done using backedges. Tricks.jl can, for example, declare Tim Holy traits that depend on whether or not a method has been defined.
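
As an illustrative sketch (not code from the talk), a Holy trait built on Tricks.jl's static_hasmethod resolves at compile time and is re-generated through backedges if the relevant method is defined later:

    using Tricks: static_hasmethod

    struct Iterable end
    struct NotIterable end

    # Resolved at compile time; re-generated via backedges if an
    # `iterate` method for T is defined after this point.
    isiterable(::Type{T}) where {T} =
        static_hasmethod(iterate, Tuple{T}) ? Iterable() : NotIterable()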

13:00 UTC

Lux.jl: Explicit Parameterization of Neural Networks in Julia

07/28/2022, 1:00 PM–1:10 PM UTC
Red

Julia already has quite a few well-established Neural Network Frameworks -- Flux & Knet. However, certain design elements -- Coupled Model and Parameters & Internal Mutations -- associated with these frameworks make them less compiler- and user-friendly. Making changes to address these problems in the respective frameworks would be too disruptive for users. To address these challenges, we designed Lux, an NN framework.

13:00 UTC

Julia's latest in high performance sorting

07/28/2022, 1:00 PM–1:30 PM UTC
Green

This talk compares the runtime of Julia's builtin sorting with that of other languages and explains some of the techniques Julia uses to outperform other languages. This is a small part of the larger ongoing effort to equip Julia with state of the art and faster than state of the art performance for all sorting tasks.

13:00 UTC

Hunting down allocations with Julia 1.8's Allocation Profiler

07/28/2022, 1:00 PM–1:10 PM UTC
Blue

Ever written code that was too slow because of excessive allocations, but didn't know where in your code they were coming from? Julia 1.8 introduces a new Allocation Profiler for finding and understanding sources of allocations in your julia programs, providing stack traces and type info for allocation hotspots. In this talk we will introduce the allocation profiler, cover how to use it, and talk through a small success story in our own codebase.
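
A minimal usage sketch of the Profile.Allocs API added in Julia 1.8 (my_workload is a placeholder for your own code):

    using Profile

    # Sample roughly half of all allocations while running the workload.
    Profile.Allocs.@profile sample_rate=0.5 my_workload()

    results = Profile.Allocs.fetch()   # allocation records with stack traces and types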

13:10 UTC

GraphPPL.jl: a package for specification of probabilistic models

07/28/2022, 1:10 PM–1:20 PM UTC
Red

We present GraphPPL.jl - a package for user-friendly specification of probabilistic models with variational inference constraints. GraphPPL.jl creates a model as a factor graph and supports the specification of factorization and form constraints on the variational posterior for the latent variables. The package collection GraphPPL.jl, ReactiveMP.jl and Rocket.jl provide together a full reactive programming-based ecosystem for running efficient and customizable variational Bayesian inference.

13:10 UTC

Making Abstract Interpretation Less Abstract in Cthulhu.jl

07/28/2022, 1:10 PM–1:20 PM UTC
Purple

Cthulhu.jl is a highly useful tool for performance engineering as well as general debugging of Julia programs. However, as the name implies, one can quickly descend into the abyss that is Julia's compilation pipeline and get lost in the vast amounts of code even modest looking Julia functions may end up generating. I present a combination of Cthulhu.jl with a step-by-step debugger, showing concrete results every step along the type lattice to make compilation more interpretable.

13:10 UTC

HighDimPDE.jl: A Julia package for solving high-dimensional PDEs

07/28/2022, 1:10 PM–1:20 PM UTC
Blue

High-dimensional PDEs cannot be solved with standard numerical methods, as their computational cost increases exponentially in the number of dimensions. This problem, known as the curse of dimensionality, vanishes with HighDimPDE.jl. The package implements novel solvers that can solve non-local nonlinear PDEs in potentially up to 1000 dimensions.

13:20 UTC

TuringGLM.jl: Bayesian Generalized Linear models using @formula

07/28/2022, 1:20 PM–1:30 PM UTC
Red

TuringGLM makes it easy to specify Bayesian Generalized Linear Models using the formula syntax and returns an instantiated Turing model.

Example:

@formula(y ~ x1 + x2 + x3)

Heavily inspired by brms (uses RStan or CmdStanR) and bambi (uses PyMC3).
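
A minimal sketch of the intended workflow, assuming a DataFrame df with columns y, x1, x2, x3 (the data frame and column names are illustrative):

    using TuringGLM, Turing, DataFrames

    fm = @formula(y ~ x1 + x2 + x3)
    model = turing_model(fm, df)          # df::DataFrame holding y, x1, x2, x3
    chain = sample(model, NUTS(), 2_000)  # posterior sampling via Turing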

13:20 UTC

Solving transient PDEs in Julia with Gridap.jl

07/28/2022, 1:20 PM–1:30 PM UTC
Blue

In this talk we present a new feature of Gridap.jl focusing on the solution of transient Partial Differential Equations (PDEs). We will show a new API that: a) leads to weak forms with very simple syntax, b) supports automatic differentiation, c) enables the solution of multi-field and DAE systems, and d) can be used in parallel computing through GridapDistributed.jl. We will showcase the novel features for a variety of applications in fluid and solid dynamics.

13:20 UTC

Reducing Running Time and Time to First X: A Walkthrough

07/28/2022, 1:20 PM–1:30 PM UTC
Purple

Optimizing Julia isn't hard if you compare it to Python or R, where you have to be an expert in both the host language and C/C++. I'll describe what type stability is and why it is important for performance. I'll discuss it in the context of performance (raw throughput) and in the context of time to first X (TTFX). Julia is sort of notorious for having really bad TTFX in certain cases. This talk explains the workflow that you can use to reduce running time and TTFX.

13:30 UTC

Garbage Collection in Julia.

07/28/2022, 1:30 PM–1:40 PM UTC
Purple

Garbage collection is one of those productivity tools that you don't think about until you need to. We will discuss the current state of Julia GC and what can be done to make it better.

13:30 UTC

Julia Gaussian Processes

07/28/2022, 1:30 PM–2:00 PM UTC
Green

Julia Gaussian Processes (Julia GPs) is home to an ecosystem of packages whose aim is to enable research and modelling using GPs in Julia. It specifies a variety of interfaces, code which implements these interfaces in standard settings, and code built on top of these interfaces (e.g. plotting). The composability and modularity of these interfaces distinguishes it from other GP software. This talk will explore the things that you can currently do with the ecosystem, and where it’s heading.
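
A minimal sketch of GP regression in this ecosystem, assuming AbstractGPs.jl and KernelFunctions.jl (the toy data and kernel choice are illustrative):

    using AbstractGPs, KernelFunctions, Statistics

    x = rand(20)
    y = sin.(2π .* x) .+ 0.05 .* randn(20)

    f  = GP(Matern52Kernel())     # prior GP
    fx = f(x, 0.05^2)             # finite projection with observation noise
    p  = posterior(fx, y)         # posterior GP conditioned on (x, y)
    m  = mean(p, x)               # posterior mean at the training inputs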

13:30 UTC

Text Segmentation with Julia

07/28/2022, 1:30 PM–1:40 PM UTC
Red

Introducing TextSegmentation.jl, a package for text segmentation with Julia. Text segmentation is a method of dividing an unstructured document covering various contents into several parts according to its topics, making it an important technique that supports natural language processing tasks such as summarization, extraction, and question answering. Attendees will learn what text segmentation is and how to use the package, and will be able to perform it easily themselves.

13:30 UTC

Progradio.jl - Projected Gradient Optimization

07/28/2022, 1:30 PM–1:40 PM UTC
Blue

Most (Mathematical) Optimization problems are subject to bounds on the decision variables. In general, a nonlinear cost function f(x) is to be minimized, with the vector x constrained by simple bounds l <= x <= u. The Projected Gradient class of methods is tailored for this very optimization problem. Our package includes various Projected Gradient methods, fully implemented in Julia. We make use of Julia's Iterator interface, allowing for user-defined termination criteria.
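
As a generic illustration of the projected gradient idea (plain Julia, not Progradio.jl's own API), each iteration takes a gradient step and projects back onto the box l <= x <= u:

    # Minimize f subject to l .<= x .<= u, given a gradient function g.
    function projected_gradient(g, x0, l, u; step = 1e-2, iters = 1_000)
        x = clamp.(x0, l, u)                     # start from a feasible point
        for _ in 1:iters
            x = clamp.(x .- step .* g(x), l, u)  # gradient step, then projection
        end
        return x
    end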

13:40 UTC

Transformer models and framework in Julia

07/28/2022, 1:40 PM–1:50 PM UTC
Blue

An introduction to Transformers.jl and related packages for building transformer models.

13:40 UTC

Parallelizing Julia’s Garbage Collector

07/28/2022, 1:40 PM–1:50 PM UTC
Purple

With the increasing popularity of Julia for memory intensive applications, garbage collection is becoming a performance bottleneck.

Julia currently uses a serial mark-and-sweep collector, in which objects are traced starting from a root set (e.g. threads' stacks, global variables, etc.) and unreachable objects are then deallocated.

We discuss in this talk how we recently parallelized tracing of live Julia objects and the performance improvements we got so far.

13:40 UTC

Recommendation.jl: Modeling User-Item Interactions in Julia

07/28/2022, 1:40 PM–1:50 PM UTC
Red

A recommender system is a data-driven application that generates personalized content for users. This talk shows how Julia can be a deeply satisfying option to capture the unique characteristics of recommenders, which rely heavily on repetitive matrix computations in multi-stage data pipelines. To build trustworthy systems in terms of not only accuracy and scalability but also usability and fairness at large, we particularly focus on API design and evaluation methods implemented in Recommendation.jl.

13:50 UTC

Unbake the Cake (and Eat it Too!): Flexible and Performant GC

07/28/2022, 1:50 PM–2:00 PM UTC
Purple

The tension between performance and flexibility is always present when developing new systems. Often, poor performance is unacceptable. But poor flexibility hinders experimentation and evolution, which may lead to bad performance later on. In this talk, we show how we used MMTk.io – a toolkit we are developing that provides language implementers with a powerful garbage collection framework – to implement a flexible (unbaking the cake) and performant (and eating it too) memory manager for Julia.

13:50 UTC

G Research Sponsored Talk

07/28/2022, 1:50 PM–1:55 PM UTC
Red

G-Research is Europe’s leading quantitative finance research firm.

13:55 UTC

Pumas Sponsored Talk

07/28/2022, 1:55 PM–2:00 PM UTC
Red

With deep expertise in allied fields of clinical pharmacology, pharmacometrics, drug development, regulations and advanced data analytics including machine learning, Pumas-AI works with companies, laboratories and universities as their healthcare intelligence partner.

14:30 UTC

Restreaming of Jeremy Howard Keynote

07/28/2022, 2:30 PM–3:15 PM UTC
Green

Restreaming of the earlier Keynote by Jeremy Howard

15:20 UTC

oneAPI.jl: Programming Intel GPUs (and more) in Julia

07/28/2022, 3:20 PM–3:30 PM UTC
Green

oneAPI.jl is a Julia package that makes it possible to use the oneAPI framework to program accelerators like Intel GPUs. In this talk, I will explain the oneAPI framework, which accelerators it supports, and demonstrate how oneAPI.jl makes it possible to work with these accelerators from the Julia programming language.

15:30 UTC

Julius Tech Sponsored Talk

07/28/2022, 3:30 PM–3:45 PM UTC
Green

Julius offers an auto-scaling, low code graph computing solution that allows firms to quickly build transparent and adaptable data analytics pipelines.

15:45 UTC

Annual Julia Developer Survey

07/28/2022, 3:45 PM–3:55 PM UTC
Green

Results of the Julia Developer Survey 2022

16:30 UTC

BlockDates: A Context-Aware Fuzzy Date Matching Solution

07/28/2022, 4:30 PM–5:00 PM UTC
Green

We developed the open-source software package BlockDates using the Julia programming language to allow the extraction of fuzzy-matched dates from a block of text. The tool leverages contextual information and draws on external date data to find the best date matches. For each identified date, multiple interpretations are proposed and scored to find the best fit. The output includes several record-level variables that help explain the result and prioritize error detection.

16:30 UTC

Automating Reinforcement Learning for Solving Economic Models

07/28/2022, 4:30 PM–4:40 PM UTC
Blue

I present a new package which aims to automate the process of using reinforcement learning to solve discrete-time heterogeneous-agent macroeconomic models. Models with discrete choice, matching, aggregate uncertainty, and multiple locations are supported. The pure-Julia package, tentatively named Bucephalus.jl, also defines a data structure for describing this class of models, allowing new solvers to be easily implemented and models to be defined once and solved many ways.

16:30 UTC

Unlocking Julia's LLVM JIT Compiler

07/28/2022, 4:30 PM–4:40 PM UTC
Purple

Julia's compiler spends almost all of its time generating, optimizing, and compiling LLVM IR. Currently, much of this work is done under one giant lock, which is also held during type inference, reducing compiler throughput in a multithreaded environment. By using finer-grained locking and handling LLVM IR in a threadsafe manner, we can reduce contention of compilation resources. This work also leads into future JIT optimizations such as lazy, parallel, and speculative compilation of Julia code.

16:30 UTC

Julia in HPC

07/28/2022, 4:30 PM–6:00 PM UTC
BoF

The Julia HPC community has been growing over the last few years, with monthly meetings to coordinate development and to solve problems arising in the use of Julia for high-performance computing.

The Julia in HPC Birds of a Feather is an ideal opportunity to join the community and to discuss your experiences with using Julia in an HPC context.

Note: We will host the BoF via Zoom and share the meeting link 15 min before start time in the #hpc channels of JuliaCon Discord and Julia Slack.

16:30 UTC

Improving nonlinear programming support in JuMP

07/28/2022, 4:30 PM–5:00 PM UTC
JuMP

In JuMP 1.0, support for nonlinear programming is a second-class citizen. You must use the separate @NL macros, the automatic differentiation engine is a JuMP-specific implementation that cannot be swapped for alternative implementations, and vector-valued nonlinear expressions are not supported. In this talk, we discuss our plans and progress to address these issues and make nonlinear programming a first-class citizen. This work is supported by funding from Los Alamos National Laboratory.

16:30 UTC

HPC sparse linear algebra in Julia with PartitionedArrays.jl

07/28/2022, 4:30 PM–4:40 PM UTC
Red

PartitionedArrays is a distributed sparse linear algebra engine that allows Julia users to easily prototype and deploy large computations on distributed-memory HPC platforms. The long-term goal is to provide a Julia alternative to the parallel vectors and sparse matrices available in well-known distributed algebra packages such as PETSc. Using PartitionedArrays, application libraries have shown excellent strong and weak scaling results up to tens of thousands of CPU cores.

16:40 UTC

Bender.jl: A utility package for customizable deep learning

07/28/2022, 4:40 PM–4:50 PM UTC
Blue

A wide range of research on feedforward neural networks requires "bending" the chain rule during backpropagation. The package Bender.jl provides neural network layers (compatible with Flux.jl) which give users more freedom to choose every aspect of the forward mapping. This makes it easy to leverage ChainRules.jl to compose a wide range of experiments, such as training binary neural networks, Feedback Alignment, and Direct Feedback Alignment, in just a few lines of code.

16:40 UTC

Metal.jl - A GPU backend for Apple hardware

07/28/2022, 4:40 PM–4:50 PM UTC
Purple

In this talk, updates on the development of a GPU backend for Apple hardware (specifically the M-series chipset) will be presented along with a brief showcase of current capabilities and interface. The novel compilation flow will be explained and compared to the other GPU backends as well as the benefits and limitations of both a unified memory model and Apple's Metal capabilities. A brief overview of Apple's non-GPU hardware accelerators and their potential will also be discussed.

16:40 UTC

Calling Julia from MATLAB using MATDaemon.jl

07/28/2022, 4:40 PM–4:50 PM UTC
Red

MATLAB is a proprietary programming language popular for scientific computing. Calling MATLAB code from Julia via the C API has been supported for many years via MATLAB.jl. The reverse direction is more complex. One approach is to compile Julia via the C++ MEX API as in Mex.jl. In MATDaemon.jl (https://bit.ly/3JxTFFU), we instead communicate by writing data to .mat files. This method is robust across Julia and MATLAB versions, and easy to use: just download jlcall.m from the GitHub repository.

16:50 UTC

LinearSolve.jl: because A\b is not good enough

07/28/2022, 4:50 PM–5:00 PM UTC
Red

Need to solve Ax=b for x? Then use A\b! Or wait, no. Don't. If you use that method, how do you swap that out for a method that performs GPU offloading? How do you switch between UMFPACK and KLU for sparse matrices? Krylov subspace methods? What does all of this mean and why is A\b not good enough? Find out all of this and more at 11. P.S. LinearSolve.jl is the answer.
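
A short sketch of how the solver is swapped without touching the rest of the code (A and b are placeholders for your own system):

    using LinearSolve

    prob = LinearProblem(A, b)
    sol  = solve(prob)                       # let LinearSolve pick a default algorithm
    sol  = solve(prob, KrylovJL_GMRES())     # switch to a Krylov subspace method
    sol  = solve(prob, KLUFactorization())   # or KLU factorization for sparse A
    x    = sol.u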

16:50 UTC

ArrayAllocators.jl: Arrays via calloc, NUMA, and aligned memory

07/28/2022, 4:50 PM–5:00 PM UTC
Purple

ArrayAllocators.jl uses the standard array interface to allow faster zeros with calloc, allocation on specific NUMA nodes on multi-processor systems as well as aligned memory. The allocators are given as an argument to Array{T} in place of undef. Overall, this allows Julia to match the allocation performance of popular numerical libraries such as NumPy, which uses some of these techniques. In this talk, we will also explore some of the unexpected properties of these allocation methods.
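
A minimal sketch of the calloc-backed allocation described above (sizes are illustrative):

    using ArrayAllocators

    A = Array{Float64}(calloc, 10_000, 10_000)   # zero-initialized lazily by the OS
    B = zeros(Float64, 10_000, 10_000)           # eagerly zeroed, typically slower to create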

16:50 UTC

Effortless Bayesian Deep Learning through Laplace Redux

07/28/2022, 4:50 PM–5:00 PM UTC
Blue

Treating deep neural networks probabilistically comes with numerous advantages including improved robustness and greater interpretability. These factors are key to building artificial intelligence (AI) that is trustworthy. A drawback commonly associated with existing Bayesian methods is that they increase computational costs. Recent work has shown that Bayesian deep learning can be effortless through Laplace approximation. This talk presents an implementation in Julia: BayesLaplace.jl.

17:00 UTC

Compile-time programming with CompTime.jl

07/28/2022, 5:00 PM–5:30 PM UTC
Purple

Inspired by the compile-time features of Zig, we present CompTime.jl, a package that wraps Julia’s features for generated functions into a seamless interface between compile-time and runtime semantics. The desire for this came from heavy use of @generated functions within Catlab.jl, and we have found that CompTime.jl makes our code more readable, maintainable, and debuggable. We will give a tutorial and then a brief peek into the implementation.

17:00 UTC

CALiPPSO.jl: Jamming of Hard-Spheres via Linear Optimization

07/28/2022, 5:00 PM–5:10 PM UTC
Red

The CALiPPSO.jl package implements a new algorithm for producing disordered sphere packings with very high accuracy. The algorithm reaches the critical jamming point of hard spheres through a chain of constrained linear optimization problems. CALiPPSO.jl exploits the functionality of JuMP for modelling and is thus compatible with several optimizers. In collaboration with C. Artiaco, G. Parisi, and F. Ricci Tersenghi.

17:00 UTC

Large-Scale Machine Learning Inference with BanyanONNXRunTime.jl

07/28/2022, 5:00 PM–5:10 PM UTC
Blue

BanyanONNXRunTime.jl is an open-source Julia package for running PyTorch/TensorFlow models on large distributed arrays. In this talk, we show how you can use BanyanONNXRunTime.jl with BanyanDataFrames.jl for running ML models on tabular data and with BanyanImages.jl for running ML models on image data.

17:00 UTC

Benchmarking Nonlinear Optimization with AC Optimal Power Flow

07/28/2022, 5:00 PM–5:30 PM UTC
JuMP

This work discusses some of the requirements for deploying non-convex nonlinear optimization methods to solve large-scale problems in practice. AC Optimal Power Flow is proposed as a proxy-application for testing the viability of nonlinear optimization frameworks for solving such problems. The current performance of several Julia frameworks for nonlinear optimization is evaluated using a standard benchmark library for AC Optimal Power Flow.

17:00 UTC

An introduction to BOMBs.jl.

07/28/2022, 5:00 PM–5:10 PM UTC
Green

Mathematical models are crucial to build and predict the behaviour of new biological systems. However, selecting between plausible model candidates or estimating parameters is an arduous job, especially considering the different informative content of experiments. BOMBs.jl is a package to automate model simulations, pseudo-data generation, maximum likelihood estimation and Bayesian inference of parameters (Stan and Turing.jl), and the design of optimal experiments for model selection and inference.

17:10 UTC

Writing a GenericArpack library in Julia.

07/28/2022, 5:10 PM–5:40 PM UTC
Red

Arpack is a library for computing eigenvalues and eigenvectors of a linear operator. It has been used in many technical computing packages. The goal of the GenericArpack.jl package is to create a Julia translation of Arpack. Right now, the Julia GenericArpack.jl methods produce bitwise identical results to the Arpack_jll methods for Float64 types in all test cases. The new library has zero dependency on BLAS and supports element types beyond those in Arpack, such as DoubleFloats.jl.

17:10 UTC

SpeedyWeather.jl: A 16-bit weather model with machine learning

07/28/2022, 5:10 PM–5:20 PM UTC
Blue

We present SpeedyWeather.jl, a global atmospheric model currently developed as a prototype for a 16-bit climate model incorporating machine learning for accuracy and computational efficiency on different hardware. SpeedyWeather.jl is designed for type flexibility with low precision, and automatic differentiation to replace parts of the model with neural networks for a more accurate representation of climate processes and computational efficiency.

17:10 UTC

Build, Test, Sleep, Repeat: Modernizing Julia's CI pipeline

07/28/2022, 5:10 PM–5:20 PM UTC
Green

Julia's Continuous Integration pipeline has struggled for many years now as the needs of the community have significantly outgrown the old Buildbot system. In this talk we will detail the efforts of the CI dev team to provide reliability, reproducibility, security, and greater introspective ability in our CI builds. These CI improvements aren't just helping the Julia project itself, but also other related open-source projects, as we continue to generate self-contained, useful building blocks.

17:20 UTC

ExplainableAI.jl: Interpreting neural networks in Julia

07/28/2022, 5:20 PM–5:30 PM UTC
Blue

In pursuit of interpreting black-box models such as deep image classifiers, a number of techniques have been developed that attribute and visualize the importance of input features with respect to the output of a model. ExplainableAI.jl brings several of these methods to Julia, building on top of primitives from the Flux ecosystem. In this talk, we will give an overview of current features and show how the package can easily be extended, allowing users to implement their own methods and rules.

17:20 UTC

Extreme Value Analysis in Julia with Extremes.jl

07/28/2022, 5:20 PM–5:50 PM UTC
Green

In this talk, we present Extremes.jl, a package that provides exhaustive high-performance functions for the statistical analysis of extreme values with Julia. Parameter estimation, diagnostic tools for assessing model accuracy and high quantile estimation are implemented for stationary and non-stationary extreme value models. The functionalities will be illustrated in this talk by reproducing many results from the popular book of Coles (2001).

17:30 UTC

Monitoring Performance on a Hardware Level With LIKWID.jl

07/28/2022, 5:30 PM–6:00 PM UTC
Purple

Have you ever wondered how many FLOPS your CPU or GPU actually performs when executing (parts of) your Julia code? Or how much data it has read from main memory or a certain cache? Then this talk is for you! I will present LIKWID.jl (Like I Knew What I'm Doing), a Julia wrapper around the same-named performance benchmarking suite, that allows you to analyse the performance of your Julia code by monitoring various hardware performance counters sitting inside your CPU or GPU.

17:30 UTC

Advances in Transformations and NLP Modeling for InfiniteOpt.jl

07/28/2022, 5:30 PM–6:00 PM UTC
JuMP

InfiniteOpt.jl is built on a unifying abstraction for infinite-dimensional optimization problems that enables it to tackle a wide variety of problems in innovative ways. We present recent advances to InfiniteOpt.jl that significantly increase its flexibility to model and solve these challenging problems. We have developed a general transformation API to facilitate diverse solution methodologies, and we have created an intuitive nonlinear interface that overcomes the current shortcomings of JuMP.jl.

17:30 UTC

Training Spiking Neural Networks in pure Julia

07/28/2022, 5:30 PM–5:40 PM UTC
Blue

Training artificial neural networks to recapitulate the dynamics of biological neuronal recordings has become a prominent tool to understand computations in the brain. We present an implementation of a recursive-least squares algorithm to train units in a recurrent spiking network. Our code can reproduce the activity of 50,000 neurons of a mouse performing a decision-making task in less than an hour of training time. It can scale to a million neurons on a GPU with 80 GB of memory.

17:40 UTC

Simple Chains: Fast CPU Neural Networks

07/28/2022, 5:40 PM–5:50 PM UTC
Blue

SimpleChains is an open source pure-Julia machine learning library developed by Pumas-AI and Julia Computing in collaboration with Roche and the University of Maryland, Baltimore. It is specialized for relatively small-sized models and NeuralODEs, attaining best-in-class performance for these problems. The performance advantage remains significant when scaling to tens of thousands of parameters, where it's still >5x faster than Flux or PyTorch when all run on a CPU, even outperforming GPUs.

17:50 UTC

Manopt.jl – Optimisation on Riemannian manifolds

07/28/2022, 5:50 PM–6:00 PM UTC
Green

Manopt.jl provides a set of optimization algorithms for problems given on a Riemannian manifold. Built upon a generic optimization framework, together with the interface ManifoldsBase.jl for Riemannian manifolds, classical and recently developed methods are provided in an efficient implementation. This talk will also present some algorithms implemented in the package.

18:00 UTC

GatherTown -- Social break

07/28/2022, 6:00 PM–7:00 PM UTC
Green

Join us on Gather.town for a social hour.

19:00 UTC

BoF - JuliaLang en Español

Congratulations, the time has come to have a dedicated forum for JuliaLang users in Spanish. We will discuss:

  • forums and hubs where Julia is used in Spanish
  • educational materials (courses, books, articles, video tutorials), and future plans
  • diversity, inclusion, and support for Spanish speakers

Join the discussion on the bof-voice channel in Discord.

19:00 UTC

Automated Geometric Theorem Proving in Julia

07/28/2022, 7:00 PM–7:10 PM UTC
Blue

This talk introduces GeometricTheoremProver.jl, a Julia package for automated deduction in Euclidean geometry. The talk will give a short overview of geometric theorem proving concepts and hands-on demos on how to use the package to write and prove statements in Euclidean geometry. A roadmap of the package for future development plans will also be presented.

19:00 UTC

Platform-aware programming in Julia

07/28/2022, 7:00 PM–7:30 PM UTC
Purple

Heterogeneous computing resources, such as GPUs, TPUs, and FPGAs, are widely used to accelerate computations, or make them possible, in scientific/technical computing. We will talk about how loose addressing of heterogeneous computing requirements in programming language designs affects portability and modularity. We propose contextual types to answer the underlying research questions, where programs are typed by their execution platforms and Julia's multiple dispatch plays an essential role.

19:00 UTC

Julius Tech Sponsored Forum

07/28/2022, 7:00 PM–7:45 PM UTC
Sponsored forums

Enterprise adoption of Julia can be a difficult process for developers and engineers to champion. In this sponsored forum, we invite leading industry experts to talk about the common challenges organizations face when bringing Julia and Julia-based solutions onboard.

19:00 UTC

PyCallChainRules.jl: Reusing differentiable Python code in Julia

07/28/2022, 7:00 PM–7:10 PM UTC
Green

While Julia is great, there is still a lot of useful differentiable Python code in PyTorch, Jax, etc. Given that PyCall.jl is already so great and seamless, one might wonder what it takes to differentiate through those calls to Python functions. PyCallChainRules.jl aims for that ideal. DLPack.jl is leveraged to pass CPU or GPU arrays without any copy between Julia and Python.

19:00 UTC

OnlineSampling : online inference on reactive models

07/28/2022, 7:00 PM–7:30 PM UTC
Red

OnlineSampling.jl is a Julia package for online Bayesian inference on reactive models, i.e., streaming probabilistic models.

OnlineSampling provides 1) a small macro-based domain-specific language to describe reactive models and 2) a semi-symbolic inference algorithm which combines exact solutions using Belief Propagation for trees of Gaussian random variables, and approximate solutions using Particle Filtering.

19:00 UTC

The JuliaSmoothOptimizers (JSO) Organization

07/28/2022, 7:00 PM–7:30 PM UTC
JuMP

The JSO organization is a set of Julia packages for smooth and nonsmooth optimization and numerical linear algebra, intended to work consistently together and exploit the structure present in problems. It provides modeling facilities and widely useful known methods, either in the form of interfaces or pure Julia implementations, as well as unique methods that are the product of active research. We review the main features of JSO, its current status, and hint at future developments.

19:10 UTC

SIMD-vectorized implementation of high order IRK integrators

07/28/2022, 7:10 PM–7:20 PM UTC
Blue

We present a preliminary version of a SIMD-vectorized implementation of the sixteenth order 8-stage implicit Runge-Kutta integrator IRKGL16 implemented in the Julia package IRKGaussLegendre.jl. For numerical integrations of typical non-stiff problems performed in double precision, we show that a vectorized implementation of IRKGL16 that exploits the SIMD-based parallelism can clearly outperform high order explicit Runge-Kutta schemes available in the standard package DifferentialEquations.jl.

19:10 UTC

Cosmological Emulators with Flux.jl and DifferentialEquations.jl

07/28/2022, 7:10 PM–7:20 PM UTC
Green

In the next decade, forthcoming galaxy surveys will provide the astrophysical community with an unprecedented wealth of data. The standard analysis pipelines usually employed to analyze these kinds of surveys are quite expensive from a computational point of view. In this presentation I will show how, using Flux.jl and DifferentialEquations.jl, it is possible to accelerate standard analyses by some orders of magnitude.

19:20 UTC

Automatic Differentiation for Solid Mechanics in Julia

07/28/2022, 7:20 PM–7:30 PM UTC
Green

Automatic Differentiation (AD) is widely applied in many different fields of computer science and engineering to accurately evaluate derivatives of functions expressed in a computer programming language. In this talk we illustrate the use of AD for the solution of Finite Elements (FE) problems with special emphasis on solid mechanics.

19:20 UTC

Zero knowledge proofs of shuffle with ShuffleProofs.jl

07/28/2022, 7:20 PM–7:30 PM UTC
Blue

Many remote electronic voting systems use the ElGamal re-encryption mixnet as the foundation of their design, motivated by a number of ways authorities can be held accountable. In particular, zero-knowledge proofs of shuffle as implemented in the Verificatum library offer an elegant and well-established solution. In ShuffleProofs.jl, I implement a Verificatum-compatible verifier and prover for non-interactive zero-knowledge proofs of shuffle, making it more accessible, as I shall demonstrate.

19:30 UTC

MagNav.jl: airborne Magnetic anomaly Navigation

07/28/2022, 7:30 PM–7:40 PM UTC
Blue

MagNav.jl is an open-source Julia package that contains a full suite of tools for aeromagnetic compensation and airborne magnetic anomaly navigation. This talk will describe the high-level functionalities of the package, then provide a brief tutorial using real flight data that is available within the package. The functionalities can be divided into the four essential components of MagNav: sensors (flight data), magnetic anomaly maps, aeromagnetic compensation models, and navigation algorithms.

19:30 UTC

ChainRules.jl meets Unitful.jl: Autodiff via Unit Analysis

07/28/2022, 7:30 PM–7:40 PM UTC
Green

Tools for performing automatic differentiation (AD) and dimensional analysis in Julia are robust, but not always compatible. This talk explores how we can understand rule-based AD in Julia by showing how to make dimensional quantities from Unitful.jl compose with ChainRules.jl. Combining these two projects produces an intuitive look at the building blocks of AD in Julia using only rudimentary calculus and dimensional analysis.
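
A toy example of the kind of composition discussed (illustrative, not the talk's code): a custom rrule for a unitful function whose pullback carries the correct units automatically. The energy function and its input values are assumptions for the sketch.

    using ChainRulesCore, Unitful

    energy(m, v) = m * v^2 / 2          # kinetic energy, e.g. m in kg, v in m/s

    function ChainRulesCore.rrule(::typeof(energy), m, v)
        E = energy(m, v)
        energy_pullback(Ē) = (NoTangent(), Ē * v^2 / 2, Ē * m * v)  # ∂E/∂m, ∂E/∂v
        return E, energy_pullback
    end

    E, pb = rrule(energy, 2.0u"kg", 3.0u"m/s")   # E = 9.0 kg m^2 s^-2 (i.e. 9 J)
    pb(1.0)                                      # (NoTangent(), 4.5 m^2 s^-2, 6.0 kg m s^-1)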

19:30 UTC

Optimizing Floating Point Math in Julia

07/28/2022, 7:30 PM–8:00 PM UTC
Purple

Why did exp10 get 2x faster in Julia 1.6? One reason is that, unlike most other languages, Julia doesn't use the operating system-provided implementations for math (Libm). This talk will be an overview of improvements in Julia's math library since version 1.5, and areas for future improvement. We will cover computing optimal polynomials, table-based implementations, and bit-hacking for peak performance.
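
As a toy illustration of the polynomial-evaluation building block (not Julia's actual exp10 kernel), here is a truncated Taylor fit of 10^x near zero evaluated with Horner's rule; the coefficients are illustrative only:

    # Low-order Taylor coefficients of 10^x around 0: 1, ln(10), ln(10)^2/2, ln(10)^3/6.
    const C = (1.0, 2.302585092994046, 2.6509490552391993, 2.0346785922934755)
    exp10_small(x) = evalpoly(x, C)    # Horner evaluation via Base.evalpoly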

19:30 UTC

PDE-constrained optimization using JuliaSmoothOptimizers

07/28/2022, 7:30 PM–8:00 PM UTC
JuMP

In this presentation, we showcase a new optimization infrastructure within JuliaSmoothOptimizers for PDE-constrained optimization problems in Julia. We introduce PDENLPModels.jl, a package that discretizes PDE-constrained optimization problems using finite element methods via Gridap.jl. The resulting problem can then be solved by solvers tailored for large-scale optimization implemented in pure Julia, such as DCISolver.jl and FletcherPenaltyNLPSolver.jl.

19:30 UTC

Dynamical Low Rank Approximation in Julia

07/28/2022, 7:30 PM–7:40 PM UTC
Red

We present LowRankArithmetic.jl and LowRankIntegrators.jl. The conjunction of both packages forms the backbone of a computational infrastructure that enables simple and non-intrusive use of dynamical low rank approximation for on-the-fly compression of large matrix-valued data streams or the approximate solution of otherwise intractable matrix-valued ODEs. We showcase the utility of these packages for the quantification of uncertainty in scientific models.

19:40 UTC

Using Optimization.jl to seek the optimal optimiser in SciML

07/28/2022, 7:40 PM–8:10 PM UTC
Green

Optimization.jl seeks to bring together all of the optimization packages it can find, local and global, into one unified Julia interface. This means, you learn one package and you learn them all! Optimization.jl adds a few high-level features, such as integrating with automatic differentiation, to make its usage fairly simple for most cases, while allowing all of the options in a single unified interface.
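
A minimal sketch of the unified interface (the objective, starting point, and parameters are illustrative):

    using Optimization, OptimizationOptimJL, ForwardDiff

    rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

    f    = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
    prob = OptimizationProblem(f, [0.0, 0.0], [1.0, 100.0])
    sol  = solve(prob, BFGS())    # swap BFGS() for any other supported optimizer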

19:40 UTC

Visualization Dashboards with Pluto!

07/28/2022, 7:40 PM–7:50 PM UTC
Red

Data visualization with intuitive interactions is an essential feature of many scientific investigations. I propose to go over use cases and examples of why and how to develop reactive dashboards in Julia using "Pluto.jl". Pluto provides a way to isolate cells on a separate page whose style is editable as regular HTML/CSS. Alongside PlutoUI's experimental layout feature, this is a powerful tool for creating immersive interactive experiences for users.

19:40 UTC

Validating a tsunami model for coastal inundation

07/28/2022, 7:40 PM–7:50 PM UTC
Blue

How do we trust that a given fluid model is suitable for simulating water waves as they approach and wash over the land? This talk presents some of the benchmark tests used to validate a tsunami model. Using our Julia implementation of a fluid model, we check how well it conserves mass, matches analytical solutions, and reproduces laboratory experiments.

19:50 UTC

JCheck.jl: Randomized Property Testing Made Easy

07/28/2022, 7:50 PM–8:00 PM UTC
Blue

JCheck is a native Julia implementation of a randomized property testing (RPT) framework. It aims to integrate as seamlessly as possible with the Test.jl package in order to enable developers to easily use RPT along with more "traditional" approaches. Although a fair number of generators are included, designing novel ones for custom data types is a straightforward process. Additional features such as shrinkage and specification of "special" non-random input are available.

19:50 UTC

Visualizing astronomical data with AstroImages.jl

07/28/2022, 7:50 PM–8:00 PM UTC
Red

To study the cosmos, astronomers examine images captured of light exceeding human-visible colors and dynamic range. AstroImages.jl makes it easy to load, manipulate, and visualize astronomical data intuitively and efficiently using arbitrary color-schemes, stretched color scales, RGB composites, PNG rendering, and plot recipes. Come to our talk to see how you too can create beautiful images of the universe!

20:00 UTC

Juliaup - The Julia installer and version multiplexer

07/28/2022, 8:00 PM–8:10 PM UTC
Blue

This talk will present a deep dive into Juliaup, the upcoming new official Julia installer and version multiplexer. The talk will give a brief presentation of the features of Juliaup, and then dive into design decisions, integration with existing system package managers, and an outlook on planned future work.

20:00 UTC

Generalized Disjunctive Programming via DisjunctiveProgramming

07/28/2022, 8:00 PM–8:30 PM UTC
JuMP

We present a Julia package (DisjunctiveProgramming.jl) that extends the functionality in JuMP to allow modeling problems via logical propositions and disjunctive constraints. Logical propositions are converted into algebraic expressions by converting the Boolean expressions to Conjunctive Normal Form and then to algebraic inequalities. The package allows the user to specify the technique to reformulate the disjunctions (Big-M or Convex-Hull reformulation) into mixed-integer constraints.
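
To illustrate the Big-M idea in plain JuMP (not DisjunctiveProgramming.jl's own syntax), the disjunction [x <= 2] or [x >= 8] can be encoded with a binary selector; the bounds and M value are illustrative:

    using JuMP

    model = Model()
    @variable(model, 0 <= x <= 10)
    @variable(model, y, Bin)                   # y = 1 selects the first disjunct
    M = 10                                     # large enough to relax the inactive disjunct
    @constraint(model, x <= 2 + M * (1 - y))   # enforced when y = 1
    @constraint(model, x >= 8 - M * y)         # enforced when y = 0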

20:00 UTC

Microbiome.jl & BiobakeryUtils.jl for analyzing metagenomic data

07/28/2022, 8:00 PM–8:10 PM UTC
Red

Microbiome.jl is a Julia package to facilitate analysis of microbial community data. BiobakeryUtils.jl is built on top of Microbiome.jl, and provides utilities for working with a suite of command line tools (the bioBakery) that are widely used for converting raw metagenomic sequencing data into tables of taxon and gene function counts. Together, these packages provide an effective way to link microbial community data with the power of Julia’s numerical, statistical, and plotting libraries.

20:00 UTC

JuliaSyntax.jl: A new Julia compiler frontend in Julia

07/28/2022, 8:00 PM–8:30 PM UTC
Purple

JuliaSyntax.jl is a new Julia language frontend designed for precise error reporting, speed and flexibility. In this talk we'll tour the JuliaSyntax parser implementation and tree data structures, highlighting benefits for users and tool builders. We'll discuss how to losslessly map Julia source text for character-precise error reporting and how a "parse stream" abstraction cleanly separates the parser from syntax tree creation while being 10x faster than Julia's reference parser.

20:10 UTC

Contributing to Open Source with Technical Writing.

07/28/2022, 8:10 PM–8:20 PM UTC
Blue

The goal of this talk is to enlighten members of the Julia ecosystem on how they can make an impact by contributing to open source with technical writing. While this talk is targeted at beginners, there will be something for even the more experienced members.

20:30 UTC

GatherTown -- Social break

07/28/2022, 8:30 PM–9:30 PM UTC
Green

Join us on Gather.town for a social hour.

Platinum sponsors

Julia Computing, Relational AI, Julius Technology

Gold sponsors

Intel, AWS

Silver sponsors

Invenia Labs, Beacon Biosignals, Metalenz, ASML, G-Research, Conning, Pumas AI, QuEra Computing Inc., Jeffrey Sarnoff

Media partners

Packt Publication, Gather Town, Vercel

Community partners

Data Umbrella, WiMLDS

Fiscal Sponsor

NumFOCUS