Physics-informed neural networks (PINNs) have emerged as a competitive approach for solving forward and inverse differential equation problems across scientific and engineering domains. Several works have shown how PINNs can be advantageous for solving partial differential equation (PDE) systems compared to traditional methods, offering compatibility with high-performance computing architectures, the ability to combine data with mechanistic equations, and high flexibility with respect to problem geometries. However, a trained PINN represents a PDE solution only for a specific realization of boundary and initial conditions. If these change, the PINN must be retrained, which is computationally expensive and limits applications of PINNs that require fast predictions. In this work, we propose applying the universal approximation theorem to this modeling paradigm in order to train generalized PINNs that can accurately predict solutions for unseen boundary and/or initial conditions. While a-priori training costs and model complexity increase, a generalized PINN must be trained only once, and the resulting model can provide fast approximations to PDE solutions over the full problem domain. We show that this approach can enable PINNs to be employed in time-sensitive optimization problems, such as model predictive control and PDE-constrained optimization, and compare our approach with conventional techniques in terms of both accuracy and computational cost.
Maziar Raissi, Paris Perdikaris, and George E. Karniadakis. Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. Journal of Computational Physics, 378:686-707, 2019.
Shengze Cai, Zhiping Mao, Zhicheng Wang, Minglang Yin, and George E. Karniadakis. Physics-informed neural networks for fluid mechanics: A review. Acta Mechanica Sinica, 1-12, 2022.