Towards universal Neural ODEs.
Both differential equations and neural networks are powerful tools for modelling complex systems. Yet it is often infeasible to derive a governing ODE for physically complex problems, and it is equally nontrivial to learn time-series data with conventional neural networks. Neural ODEs [1] combine both realms: a neural network parameterizes the vector field of a governing ODE, which is then integrated by a numerical solver. While Neural ODEs are effective on simple, near-linear datasets, training becomes considerably harder when nonlinearities are present (e.g. [2]), even for equation-generated nonlinear problems. Their failure is especially evident on data containing both slow and fast dynamics, where they tend to capture the slow dynamics and neglect the faster ones. In this work, we propose two methodologies to enhance the performance of Neural ODEs. The first builds on the partition of unity and the proper generalized decomposition (PGD), providing a way to separate slow dynamics from fast ones. The second relies on a careful choice of augmentation functions to increase the dimension of the Neural ODE's vector field. Our results show that both proposed methodologies enhance the modelling capabilities of Neural ODEs on nonlinear time series with multiple time scales.
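To make the Neural ODE setting concrete, the following is a minimal sketch of the idea in [1]: a small network parameterizes the vector field dy/dt = f_theta(t, y), a fixed-step solver integrates it, and the solver is differentiated through to fit observed data. All names (ODEFunc, rk4_integrate) and hyperparameters are illustrative assumptions, not taken from the paper, and the data here is a random placeholder.

```python
import torch
import torch.nn as nn

class ODEFunc(nn.Module):
    """Neural network parameterizing the ODE vector field dy/dt = f_theta(t, y)."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(),
            nn.Linear(hidden, dim),
        )

    def forward(self, t, y):
        # Autonomous field for simplicity; t could be concatenated to y as well.
        return self.net(y)

def rk4_integrate(f, y0, ts):
    """Fixed-step RK4; returns the trajectory evaluated at the time points ts."""
    ys, y = [y0], y0
    for t0, t1 in zip(ts[:-1], ts[1:]):
        h = t1 - t0
        k1 = f(t0, y)
        k2 = f(t0 + h / 2, y + h / 2 * k1)
        k3 = f(t0 + h / 2, y + h / 2 * k2)
        k4 = f(t1, y + h * k3)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        ys.append(y)
    return torch.stack(ys)

# Fit the vector field to an observed trajectory by backpropagating
# through the solver (discretize-then-optimize).
dim = 2
ts = torch.linspace(0.0, 1.0, 50)
y_obs = torch.randn(len(ts), dim)   # placeholder for real time-series data
func = ODEFunc(dim)
opt = torch.optim.Adam(func.parameters(), lr=1e-3)
for _ in range(100):
    opt.zero_grad()
    y_pred = rk4_integrate(func, y_obs[0], ts)
    loss = ((y_pred - y_obs) ** 2).mean()
    loss.backward()
    opt.step()
```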
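The abstract does not detail the partition-of-unity/PGD construction, so the sketch below is only a generic illustration of a partition of unity in time: several sub-network vector fields are blended by smooth weights phi_i(t) that sum to one, allowing different sub-networks to specialize in slow versus fast regimes. The softmax gating and all names are assumptions, not the authors' formulation; the module plugs into rk4_integrate from the sketch above.

```python
import torch
import torch.nn as nn

class PoUVectorField(nn.Module):
    """Blend of sub-network vector fields f_i weighted by partition-of-unity
    functions phi_i(t), with sum_i phi_i(t) = 1 enforced here via a softmax.
    (Generic illustration only; the paper's construction may differ.)"""
    def __init__(self, dim, n_parts=2, hidden=32):
        super().__init__()
        self.fields = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim))
            for _ in range(n_parts)
        )
        # Each phi_i is a smooth function of time; softmax over parts gives a PoU.
        self.gate = nn.Sequential(nn.Linear(1, hidden), nn.Tanh(), nn.Linear(hidden, n_parts))

    def forward(self, t, y):
        phi = torch.softmax(self.gate(t.reshape(1)), dim=-1)  # (n_parts,), sums to 1
        return sum(phi[i] * f(y) for i, f in enumerate(self.fields))

# Usage with the solver from the previous sketch:
#   field = PoUVectorField(dim=2)
#   traj = rk4_integrate(field, y0, ts)
```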
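For the second methodology, the abstract only states that augmentation functions increase the dimension of the vector field. A common baseline in the augmented Neural ODE literature is to pad the initial state with zeros and integrate in the higher-dimensional space; the sketch below shows that baseline only, reusing ODEFunc and rk4_integrate from the first sketch. The paper's "smart" augmentation functions would replace the zero padding, and their exact form is not given in the abstract.

```python
import torch

def augment(y0, n_aug):
    """Baseline augmentation: pad the initial state with zeros so the ODE
    evolves in a higher-dimensional space. The paper's augmentation
    functions would replace these zeros (assumption, not specified here)."""
    return torch.cat([y0, torch.zeros(n_aug)])

# Usage with the pieces from the first sketch (hypothetical names):
dim, n_aug = 2, 2
func = ODEFunc(dim + n_aug)              # vector field on the augmented space
ts = torch.linspace(0.0, 1.0, 50)
y0 = torch.randn(dim)
traj = rk4_integrate(func, augment(y0, n_aug), ts)
y_pred = traj[:, :dim]                   # project back to observed coordinates
```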