PDE-Constrained Manifold Gaussian Processes for High-Dimensional Problems
The task of modeling and numerically experimenting with high-dimensional partial differential equations (PDEs) has garnered significant attention. The key idea of this work is to leverage PDE information to construct deep kernel architectures that facilitate uncertainty quantification. In particular, a probabilistic machine learning tool, \emph{manifold Gaussian processes / deep kernel learning} (DKL), is used to embed neural network structures into Gaussian processes. Our method incorporates the governing PDEs into DKL and maps the data into a low-dimensional feature space, which not only simplifies the regression but also extends PDE-constrained Gaussian processes with improved nonlinear expressivity. Numerical results show that this integration yields enhanced robustness and versatility in surrogate modeling. Crucial technical aspects include the selection of appropriate network structures, suitable parameter initialization, and careful hyperparameter tuning. We propose tailored optimization settings to tackle the challenges of this high-dimensional approximation task.
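The deep-kernel construction underlying manifold Gaussian processes can be sketched as follows: a neural network maps the high-dimensional input into a low-dimensional feature space, and a standard kernel is evaluated on those features. The sketch below is illustrative only, assuming an RBF base kernel and a small tanh network with random weights; the abstract's actual architecture, PDE constraints, and training procedure are not specified here, and in practice the network weights would be optimized jointly with the kernel hyperparameters (e.g. by maximizing the marginal likelihood).

```python
import numpy as np

rng = np.random.default_rng(0)

def feature_map(X, W1, b1, W2, b2):
    """Small MLP mapping inputs into a low-dimensional feature space."""
    H = np.tanh(X @ W1 + b1)
    return H @ W2 + b2

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """RBF base kernel, evaluated on the learned features."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

# Toy high-dimensional regression problem (hypothetical data, not from the paper):
# the response depends on a one-dimensional projection of a 20-D input.
d_in, d_hidden, d_feat, n = 20, 16, 2, 50
X = rng.normal(size=(n, d_in))
w_true = rng.normal(size=d_in)
y = np.sin(X @ w_true) + 0.05 * rng.normal(size=n)

# Untrained network weights, for illustration only.
W1 = rng.normal(size=(d_in, d_hidden)) / np.sqrt(d_in)
b1 = np.zeros(d_hidden)
W2 = rng.normal(size=(d_hidden, d_feat)) / np.sqrt(d_hidden)
b2 = np.zeros(d_feat)

Z = feature_map(X, W1, b1, W2, b2)       # n x d_feat features
K = rbf_kernel(Z, Z) + 1e-4 * np.eye(n)  # deep-kernel Gram matrix

# GP posterior mean at the training inputs (noise std 0.05).
alpha = np.linalg.solve(K + 0.05**2 * np.eye(n), y)
mean = K @ alpha
print(mean.shape)
```

Composing the kernel with a feature map in this way keeps the Gaussian-process machinery (and its predictive uncertainty) intact while the network supplies the nonlinear dimension reduction.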