Minisymposium Presentation
Enzyme.jl: High-Performance, Cross-Language, and Parallel Automatic Differentiation in Julia
Description
Automatic differentiation (AD) is key to training neural networks, Bayesian inference, and scientific computing. Applying these techniques typically requires either rewriting code in a specific machine learning framework or manually providing derivatives. This talk presents Enzyme, a high-performance automatic differentiation compiler plugin for the low-level virtual machine (LLVM) compiler, capable of synthesizing gradients of programs expressed in the LLVM intermediate representation (IR). Enzyme differentiates programs in any language whose compiler targets LLVM, including C/C++, Fortran, Julia, Rust, JAX, Swift, etc., thereby providing native AD capabilities in those languages with state-of-the-art performance. Unlike traditional source-to-source and operator-overloading tools, Enzyme performs AD on optimized IR. We show that AD on optimized IR achieves a geometric mean speedup of 4.2x over AD on IR before optimization, and orders-of-magnitude speedups on GPU accelerator codes.
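As a flavor of this workflow, the sketch below (illustrative only, not part of the abstract; Enzyme.jl's autodiff signature has varied across versions) differentiates an ordinary scalar Julia function in reverse mode:

    using Enzyme  # Julia bindings for the Enzyme LLVM plugin

    # An ordinary Julia function; no framework-specific rewrite needed.
    f(x) = x^2 + sin(x)

    # Reverse-mode AD on the compiled representation of f.
    # `Active` marks a differentiable scalar argument; the first `Active`
    # gives the return activity. The result nests the derivative as ((df/dx,),).
    dfdx = autodiff(Reverse, f, Active, Active(1.0))[1][1]

    # Sanity check against the analytic derivative 2x + cos(x).
    @assert dfdx ≈ 2 * 1.0 + cos(1.0)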
This talk will discuss AD through the lens of Enzyme.jl, the Julia bindings for Enzyme. While Enzyme is applicable to any LLVM-based programming language, working within Julia presents several opportunities and challenges. Julia makes it easy to write generic code that can be automatically retargeted to any backend, without the programmer also needing to become an expert in each backend's programming model. This flexibility, however, comes at the cost of just-in-time compilation and garbage collection.
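To make the generic-code point concrete, here is a minimal sketch (an assumed example, not taken from the talk) of a function written once against AbstractArray, so it runs unchanged on CPU or GPU array types, differentiated with Enzyme.jl's Duplicated activity annotation:

    using Enzyme

    # Generic code: works for any AbstractArray backend, CPU or GPU alike.
    sumsq(x) = sum(abs2, x)

    x  = rand(10)
    dx = zero(x)   # shadow buffer; Enzyme accumulates the gradient here

    # `Duplicated` pairs a value with its shadow for reverse-mode AD.
    autodiff(Reverse, sumsq, Active, Duplicated(x, dx))

    @assert dx ≈ 2 .* x   # gradient of sum(x_i^2) is 2x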