Scalable Uncertainty Quantification for Deep Molecular Models
Molecular modeling can provide atomistic-level insight into complex physical phenomena. The quality of the predictions depends critically on the employed model that defines the particle interactions. Neural network (NN) potentials are a class of models that have seen tremendous success in recent years, owing to their flexibility and capacity to learn many-body interactions [1]. At the same time, these models are prone to overfitting, which heightens the need for accurate and computationally efficient uncertainty quantification (UQ) methods. In this talk, I will present the current state of the art in deep molecular modeling. Using various test-case materials, I will demonstrate how scalable Bayesian methods can accurately quantify the uncertainty of molecular dynamics (MD) simulation predictions [2]. Furthermore, I will present JaxSGMC [3], an application-agnostic library for stochastic gradient Markov chain Monte Carlo (SG-MCMC) in JAX. Its implementation of several state-of-the-art samplers, together with the possibility of building custom samplers from standard building blocks, facilitates the development and application of Bayesian deep learning approaches across a broad range of domains.

REFERENCES

[1] S. Thaler and J. Zavadlav, "Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting", Nat. Commun. 12, 2021.

[2] S. Thaler, G. Doehner, and J. Zavadlav, "Scalable Bayesian Uncertainty Quantification for Neural Network Potentials: Promise and Pitfalls", J. Chem. Theory Comput. 19, 4520-4532, 2023.

[3] S. Thaler, P. Fuchs, A. Cukarska, and J. Zavadlav, "JaxSGMC: Modular Stochastic Gradient MCMC in JAX", 2023, DOI: 10.2139/ssrn.4523404.
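To illustrate the kind of sampler SG-MCMC libraries modularize, the sketch below implements stochastic gradient Langevin dynamics (SGLD) in plain JAX on a toy Bayesian linear model. This is a minimal, self-contained example of the general SG-MCMC update, not JaxSGMC's actual API; the model, function names, and step size are all illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def log_posterior(params, x, y):
    # Toy Gaussian likelihood with a standard-normal prior on the parameters
    # (illustrative model, not from the referenced work).
    pred = x @ params
    log_lik = -0.5 * jnp.sum((pred - y) ** 2)
    log_prior = -0.5 * jnp.sum(params ** 2)
    return log_lik + log_prior

def sgld_step(params, key, x_batch, y_batch, step_size=5e-3):
    # One SGLD update: a gradient-ascent step on the (mini-batch) log-posterior
    # plus Gaussian noise whose variance matches the step size, so the chain
    # samples from the posterior rather than collapsing to the MAP estimate.
    grad = jax.grad(log_posterior)(params, x_batch, y_batch)
    noise = jax.random.normal(key, params.shape)
    return params + 0.5 * step_size * grad + jnp.sqrt(step_size) * noise

# Usage: draw posterior samples for a noiseless toy regression problem.
key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (32, 2))
true_w = jnp.array([1.0, -2.0])
y = x @ true_w
params = jnp.zeros(2)
samples = []
for _ in range(200):
    key, subkey = jax.random.split(key)
    params = sgld_step(params, subkey, x, y)
    samples.append(params)
# Discard the first half as burn-in; the sample spread estimates uncertainty.
posterior_mean = jnp.stack(samples[100:]).mean(axis=0)
```

In a production setting one would draw mini-batches per step and decay the step size; modular samplers factor exactly these choices (gradient estimator, noise schedule, integrator) into exchangeable building blocks.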