(298b) Enabling Interpolation of Sparse Data Via Neural ODEs | AIChE

Authors 

Boukouvala, F., Georgia Institute of Technology
Volkovinsky, R., Georgia Institute of Technology
Developing an interpretable (i.e., mechanistic) predictive model of a dynamic response is at the heart of model-based process development and control. Model-building, however, can be time-intensive. Increased complexity of a process system invariably requires increased complexity of the associated model, forcing the modeler to test multiple model structures, each of which may require long computation times to simulate, to estimate parameters for, and to optimize. For models posed as differential equations (DEs), several authors have proposed an indirect approach to model fitting that accelerates regression by avoiding repeated numerical integration of the mechanistic DE model [1-3]. The accuracy of this indirect approach, however, depends strongly on how accurately derivative information can be inferred from the measured data.
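The indirect (two-stage) idea can be sketched minimally as follows. This is an illustrative example only, not the cited authors' implementations: a first-order decay dy/dt = -k·y with hypothetical data, a polynomial standing in for whatever smoother one prefers, and a one-parameter linear least-squares fit in place of a general regression.

```python
import numpy as np

# Hypothetical sparse measurements of a first-order decay dy/dt = -k*y,
# generated here with true k = 0.5 (illustrative data, not from the talk).
t = np.linspace(0.0, 4.0, 9)
y = np.exp(-0.5 * t)

# Stage 1: interpolate the data, then differentiate the interpolant.
# No numerical integration of the ODE is needed at any point.
p = np.poly1d(np.polyfit(t, y, deg=4))
y_hat = p(t)
dy_dt = p.deriv()(t)  # inferred state derivatives

# Stage 2: regress the model parameters against the inferred derivatives.
# For dy/dt = -k*y this is linear least squares with closed-form solution.
k_hat = -np.sum(dy_dt * y_hat) / np.sum(y_hat ** 2)
```

The quality of `k_hat` hinges entirely on how well the Stage-1 interpolant captures the true derivatives, which is exactly the sensitivity the paragraph above describes.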

Key to accurate estimation of system derivatives is accurate interpolation of the measured data, which may be spread across multiple experiments. Recently, we proposed Neural ODEs as a data-driven means of interpolating state data for parameter estimation, offering evidence that Neural ODEs can infer state derivatives more accurately than algebraic data-driven models [4]. What has yet to be convincingly demonstrated, however, is whether, and under what circumstances, Neural ODEs interpolate data better than standard interpolation techniques. This presentation addresses that gap. Through a series of case studies, we show how the ability of Neural ODEs to transfer learning across experiments gives them a global interpolation property, allowing them to interpolate datasets outside the reach of standard techniques, which interpolate only locally. To demonstrate the framework's generalizability, the presentation maps out the criteria necessary for robust interpolation of sparse datasets via Neural ODEs. Finally, since derivative estimation is rarely an end in itself, we conclude by demonstrating how accurate derivative estimation under sparse-data conditions enables automated kinetic model identification. In addition to highlighting the flexible nature of Neural ODEs, the presentation thus aims to present a vision of the expansive set of problems these universal interpolators are well positioned to address.
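To make the Neural-ODE interpolation idea concrete, here is a deliberately tiny plain-NumPy sketch, not the authors' framework: a small network defines the ODE right-hand side, and its weights are fit so that the integrated trajectory passes through sparse observations. Practical implementations use adaptive solvers and adjoint-based gradients (e.g., via libraries such as torchdiffeq); the forward-Euler rollout, finite-difference gradients, and single-state data below are simplifications for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sparse observations of a single decaying state (illustrative data)
t_obs = np.array([0.0, 0.8, 1.6, 2.4, 3.2, 4.0])
y_obs = np.exp(-0.5 * t_obs)

H = 4  # hidden width of the tiny network playing the ODE right-hand side

def rhs(y, theta):
    # One-hidden-layer MLP: dy/dt = w2 . tanh(w1*y + b1) + b2
    w1, b1, w2, b2 = theta[:H], theta[H:2*H], theta[2*H:3*H], theta[3*H]
    return np.tanh(w1 * y + b1) @ w2 + b2

def integrate(theta, t_grid, y0):
    # Forward-Euler rollout of the learned ODE (crude stand-in for a real solver)
    ys = [y0]
    for i in range(len(t_grid) - 1):
        ys.append(ys[-1] + (t_grid[i + 1] - t_grid[i]) * rhs(ys[-1], theta))
    return np.array(ys)

t_grid = np.linspace(0.0, 4.0, 41)
obs_idx = np.argmin(np.abs(t_grid[:, None] - t_obs[None, :]), axis=0)

def loss(theta):
    # Penalize mismatch only at the sparse observation times
    return np.mean((integrate(theta, t_grid, y_obs[0])[obs_idx] - y_obs) ** 2)

theta = 0.1 * rng.standard_normal(3 * H + 1)
loss0 = loss(theta)
eps = 1e-5
for _ in range(150):
    # Finite-difference gradient: a simple stand-in for adjoint backpropagation
    g = np.array([(loss(theta + eps * e) - loss(theta - eps * e)) / (2 * eps)
                  for e in np.eye(theta.size)])
    step = 0.2
    while step > 1e-8 and loss(theta - step * g) >= loss(theta):
        step *= 0.5  # backtracking keeps each update from increasing the loss
    if loss(theta - step * g) < loss(theta):
        theta -= step * g
```

Once trained, `integrate` evaluates the fitted trajectory on any time grid and `rhs` returns the implied derivative at any state, which is what makes a Neural ODE usable as a global interpolator rather than a piecewise-local one.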

References

1) Kahrs, O., et al., Incremental Identification of Hybrid Models of Dynamic Process Systems, in Model-Based Control: Bridging Rigorous Theory and Advanced Technology, P.M.J. Hof, C. Scherer, and P.S.C. Heuberger, Editors. 2009, Springer US: Boston, MA. p. 185-202.

2) Varah, J.M., A Spline Least Squares Method for Numerical Parameter Estimation in Differential Equations. SIAM Journal on Scientific and Statistical Computing, 1982. 3(1): p. 28-46.

3) Mehrkanoon, S., S. Mehrkanoon, and J.A.K. Suykens, Parameter estimation of delay differential equations: An integration-free LS-SVM approach. Communications in Nonlinear Science and Numerical Simulation, 2014. 19(4): p. 830-841.

4) Bradley, W. and F. Boukouvala, Two-Stage Approach to Parameter Estimation of Differential Equations Using Neural ODEs. Industrial & Engineering Chemistry Research, 2021.