ECCOMAS 2024

Conditional Expectation and ROMs

  • Matthies, Hermann (TU Braunschweig)


One frequently deals with systems which have parameters; these may be parameters which one can use to control or optimise the system, or they may be imposed from outside and follow some probability distribution. This last case may be taken as the “Leitmotiv” for the following. The reduced order model (ROM) is produced from the full order model (FOM) by some kind of projection onto a (relatively) low-dimensional manifold, such as low-rank tensors, or in the simpler cases a subspace. The parameter-dependent reduction process produces a map from the parameter set into the ROM manifold which in some way approximates the FOM state for each parameter value. It is then of interest to examine the relation between the FOM state and the ROM approximation over all parameter values of interest. In machine learning, one likewise learns a map from the parameter set into the ROM manifold, the image space of the machine-learning model. This is done on a training set of samples, typically by minimising the mean-square error. The training set may be seen as a sample from some probability distribution, so the training is an approximate computation of the expectation, and it therefore produces an approximation of the conditional expectation. The conditional expectation is a special case of Bayesian updating in which the loss function is the mean-square error. This offers the possibility of a unified view of these methods and of introducing more general loss functions. It additionally makes it possible to address uncertainty in the ROM in an intrinsic manner.
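
The link between mean-square training and conditional expectation rests on the classical fact that the minimiser of E‖U − φ(Q)‖² over measurable maps φ is φ(q) = E[U | Q = q], i.e. the posterior mean in the Bayesian sense. The following is a minimal sketch of that fact, not taken from the talk: the parameter q, the synthetic state u = sin(q) + noise, and the polynomial surrogate class are all illustrative assumptions. Fitting the surrogate by minimising the empirical mean-square error over the samples recovers the known conditional expectation E[u | q] = sin(q).

```python
# Minimal sketch: least-squares training on samples approximates the
# conditional expectation. All modelling choices here (sin, noise level,
# polynomial degree) are illustrative assumptions, not from the abstract.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "parameter" q and "FOM state" u with known conditional mean:
# u = sin(q) + noise, hence E[u | q] = sin(q).
n = 2000
q = rng.uniform(-np.pi, np.pi, size=n)
u = np.sin(q) + 0.3 * rng.standard_normal(n)

# Surrogate class: polynomials of degree 7 in q. Minimising the empirical
# mean-square error over this class is an ordinary least-squares problem.
degree = 7
Phi = np.vander(q, degree + 1)              # design matrix of monomials
coef, *_ = np.linalg.lstsq(Phi, u, rcond=None)

# Compare the trained surrogate with the true conditional expectation.
q_test = np.linspace(-np.pi, np.pi, 9)
u_hat = np.vander(q_test, degree + 1) @ coef
print("q         E[u|q]=sin(q)   surrogate")
for qt, uh in zip(q_test, u_hat):
    print(f"{qt:+.3f}    {np.sin(qt):+.4f}         {uh:+.4f}")
```

Replacing the square loss by another Bayesian loss function yields a different posterior summary (for instance, the absolute-error loss gives the conditional median), which is the generalisation the abstract alludes to.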