ECCOMAS 2024

Unravelling Architectures of Continuous-Time Recurrent Neural Networks

  • Datar, Chinmay (Technical University of Munich)
  • Datar, Adwait (Institute for Data Science Foundations (TUHH))
  • Dietrich, Felix (Technical University of Munich)
  • Schilders, Wil (Eindhoven University of Technology)


Incorporating physics-based knowledge into artificial neural networks for simulating dynamical systems is essential to satisfy physical constraints, improve generalization, and reduce data requirements. In this talk, we discuss a physics-based approach to compute efficient architectures of continuous-time recurrent neural networks without the need for an expensive architecture search. As a first step towards constructing architectures for general dynamical systems, we focus on Linear Time-Invariant (LTI) systems. We use a variant of continuous-time recurrent neural networks in which the output of each neuron is the solution of an ordinary differential equation. Our approach is physics-based and gradient-free because we compute the architecture and parameters of the network directly from the state-space matrices. We provide an upper bound on the numerical error, building on the guarantees provided by the ODE solvers used to compute the states of individual neurons. Empirically, we demonstrate that our physics-based network reproduces the numerical solution of a two-dimensional convection-diffusion equation up to machine precision.
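To make the gradient-free idea concrete, the sketch below illustrates it for the simplest case: for an autonomous LTI system x'(t) = A x(t) with readout y = C x, a linear continuous-time recurrent network whose neuron states obey h'(t) = W h(t) reproduces the system exactly when the weights are read off as W = A, so no training is needed and the accuracy is limited only by the ODE solver tolerances. This is an illustrative toy example under our own assumptions, not the authors' construction, which handles general state-space matrices and provides explicit error bounds.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.linalg import expm

# Toy LTI system x'(t) = A x(t), y(t) = C x(t) (randomly generated here).
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n)) - 2.0 * np.eye(n)  # shift for stable-ish dynamics
C = rng.standard_normal((1, n))
x0 = rng.standard_normal(n)

# "Network weights" taken directly from the state-space matrix: no
# gradient-based training, the architecture/parameters are the physics.
W = A

# Each neuron's state is the solution of an ODE; integrate the coupled system.
sol = solve_ivp(lambda t, h: W @ h, (0.0, 1.0), x0, rtol=1e-10, atol=1e-12)
h_T = sol.y[:, -1]

# Reference: the exact LTI solution via the matrix exponential.
x_T = expm(1.0 * A) @ x0

# The network output matches the exact output up to solver tolerance.
err = np.linalg.norm(C @ h_T - C @ x_T)
print(err)
```

The discrepancy `err` is governed entirely by the integrator's `rtol`/`atol`, mirroring the talk's point that the network's error bound inherits the guarantees of the underlying ODE solver.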