
Doing Scientific Machine Learning with Julia’s SciML Ecosystem

posted on 2020-08-03, 00:48, authored by Christopher Rackauckas
Scientific machine learning combines differentiable programming, scientific simulation, and machine learning in order to impose physical constraints on machine learning and to automatically learn biological models. Given the composability of Julia, it is positioned as the best language for this set of numerical techniques, but how do you actually "do" SciML? This workshop gets your hands dirty.

Join via Zoom: link in email from Eventbrite. Backup YouTube link: https://youtu.be/QwVO0Xh2Hbg

In this workshop we'll dive into some of the latest techniques in scientific machine learning, including Universal Differential Equations ([Universal Differential Equations for Scientific Machine Learning](https://arxiv.org/abs/2001.04385)), Physics-Informed Neural Networks ([Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations](https://www.sciencedirect.com/science/article/pii/S0021999118307125)), and Sparse Identification of Nonlinear Dynamics (SINDy, [Discovering governing equations from data by sparse identification of nonlinear dynamical systems](https://www.pnas.org/content/113/15/3932)). The goal is for workshop participants to become familiar with what these methods are, what kinds of problems they solve, and how to use Julia packages to implement them.
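Before applying any of these data-driven methods, you need a simulated system to learn from. As a minimal, hedged warm-up sketch (assuming DifferentialEquations.jl is installed; the parameter values are illustrative, not from the workshop materials), here is the classic Lotka-Volterra system, the kind of model these techniques aim to recover from data:

```julia
# Simulate a Lotka-Volterra predator-prey system with DifferentialEquations.jl.
using DifferentialEquations

function lotka_volterra!(du, u, p, t)
    α, β, δ, γ = p
    du[1] =  α * u[1] - β * u[1] * u[2]   # prey growth minus predation
    du[2] = -δ * u[2] + γ * u[1] * u[2]   # predator decay plus feeding
end

u0 = [1.0, 1.0]                 # initial prey and predator populations
p = [1.5, 1.0, 3.0, 1.0]        # illustrative parameter values
prob = ODEProblem(lotka_volterra!, u0, (0.0, 10.0), p)
sol = solve(prob, Tsit5(), saveat = 0.1)  # dense time series to learn from
```

The saved time series `sol` then serves as the "data" that SINDy or a universal differential equation would try to reconstruct.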

The workshop will jump right into how to model the missing part of a physical simulation, describe how universal approximators (neural networks) can be used in this context, and show how to transform such problems into an optimization problem which is then accelerated by specialized automatic differentiation. The set of packages involved is somewhat intense, using many tools from JuliaDiffEq ([DiffEqFlux.jl](https://diffeqflux.sciml.ai/dev/), DifferentialEquations.jl, DiffEqSensitivity.jl, ModelingToolkit.jl, [NeuralPDE.jl](https://neuralpde.sciml.ai/dev/), DataDrivenDiffEq.jl, Surrogates.jl, etc.) combined with machine learning tools (Flux.jl), differentiation tooling (SparseDiffTools.jl, Zygote.jl, ForwardDiff.jl, ReverseDiff.jl, etc.), and optimization tooling (JuMP, Optim.jl, Flux.jl, NLopt.jl, etc.), all spun together in a glorious soup that automatically discovers physical laws at the end of the day. Thus this workshop has something different to offer for everyone: new users of Julia will get a nice overview of the unique composability of the Julia package ecosystem, while experienced Julia users will learn how to bridge an area they are comfortable with (such as machine learning) to a whole new set of phenomena. Meanwhile, even those who are only knee-deep in coding can gain a lot from learning these new mathematical advances, meaning that even a casual observer likely has a lot to learn!
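The "model the missing part" idea can be sketched in a few lines. The following is a hedged illustration, not the workshop's exact code: it assumes DifferentialEquations.jl, Flux.jl, and DiffEqFlux.jl are installed, and the function name `ude!` and the coefficients are made up for the example. The known linear physics is written down directly, while a small neural network stands in for the unknown interaction terms; its flattened parameters become the ODE parameters, turning "learn the missing dynamics" into an optimization problem.

```julia
# Sketch of a universal differential equation (UDE): known physics plus a
# neural-network correction term whose parameters are trained against data.
using DifferentialEquations, Flux

# Small neural network standing in for the unknown part of the model.
nn = Flux.Chain(Flux.Dense(2, 16, tanh), Flux.Dense(16, 2))
p, re = Flux.destructure(nn)   # flatten parameters into a vector `p`

function ude!(du, u, p, t)
    û = re(p)(u)                # neural-network correction term
    du[1] =  1.3f0 * u[1] + û[1]   # known linear growth + learned part
    du[2] = -1.8f0 * u[2] + û[2]   # known linear decay  + learned part
end

u0 = Float32[0.44, 4.6]
prob = ODEProblem(ude!, u0, (0.0f0, 3.0f0), p)
sol = solve(prob, Tsit5(), saveat = 0.1f0)
```

From here, a loss function comparing `sol` against measured data can be minimized over `p` (e.g. with DiffEqFlux's training utilities), with the sensitivity tooling above providing the gradients through the solver.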

See https://github.com/mitmath/18337 and https://github.com/mitmath/18S096SciML for MIT courses that were recently developed for teaching Scientific Machine Learning: what I plan to do here is a condensed version that leaves out many of the details of how the libraries were developed (the main content of the math course) and instead focuses directly on how to use these methods in day-to-day technical computing.
