figshare
Merging Differential Equations with Machine Learning through Differentiable Programming

presentation
posted on 2020-08-03, 01:13, authored by Christopher Rackauckas
Scientific Machine Learning (SciML) is an emerging discipline which merges the mechanistic models of science and engineering with non-mechanistic machine learning models to solve problems that were previously intractable. Recent results have showcased how methods like Physics-Informed Neural Networks (PINNs) can be utilized as data-efficient learning methods, embedding the structure of physical laws as a prior into a learnable structure so that small data and neural networks suffice to predict phenomena. Additionally, deep learning embedded within backward stochastic differential equations has been shown to be an effective tool for solving high-dimensional partial differential equations, such as the Hamilton-Jacobi-Bellman equation in 1000 dimensions. In this poster we will introduce the audience to these methods and show how these diverse methods are all instantiations of a neural differential equation: a differential equation where all or part of the equation is described by a latent neural network. Once this is realized, we will show how a computational tool, DiffEqFlux.jl, is being optimized to allow for efficient training of a wide variety of neural differential equations, explaining how the performance properties of these equations differ from more traditional uses of differential equations, along with some of the early results of optimizing for this domain. The audience will leave knowing how neural differential equations and DiffEqFlux.jl may be a vital part of next-generation scientific tooling.
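To make the core idea concrete, here is a minimal sketch of a neural differential equation: a system du/dt = f_θ(u) in which the vector field f_θ is a small neural network. This is an illustrative Python/NumPy sketch only, not DiffEqFlux.jl's actual API (which is Julia); the network shape, parameter names, and the use of a fixed-step Euler integrator are all simplifying assumptions made for the example.

```python
import numpy as np

# Parameters of a tiny one-hidden-layer network f_theta: R^1 -> R^1.
# (Hypothetical toy model; sizes chosen only for illustration.)
rng = np.random.default_rng(0)
W1, b1 = 0.5 * rng.normal(size=(8, 1)), np.zeros(8)
W2, b2 = 0.5 * rng.normal(size=(1, 8)), np.zeros(1)

def f_theta(u):
    """The latent neural network defining the right-hand side du/dt = f_theta(u)."""
    return W2 @ np.tanh(W1 @ u + b1) + b2

def solve_neural_ode(u0, t0, t1, n_steps=100):
    """Integrate the neural ODE with forward Euler.
    A production solver (e.g. the adaptive methods DiffEqFlux.jl builds on)
    would control step size and error; Euler keeps the sketch short."""
    u = np.array(u0, dtype=float)
    dt = (t1 - t0) / n_steps
    for _ in range(n_steps):
        u = u + dt * f_theta(u)
    return u

# Evolve an initial condition through the learned (here, random) dynamics.
u_final = solve_neural_ode([1.0], 0.0, 1.0)
print(u_final.shape)
```

Training such a model means differentiating the solution u(t1) with respect to the network parameters (W1, b1, W2, b2) and minimizing a loss against data; that differentiation through the solver is precisely what differentiable programming frameworks like DiffEqFlux.jl provide efficiently.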