(674d) Development of a Hybrid First Principles – Artificial Intelligence Approach for Dynamic Modeling of Complex Systems
AIChE Annual Meeting
Thursday, November 19, 2020 - 8:45am to 9:00am
Various AI modeling techniques, such as neural networks (NNs), deep learning, expert systems, and fuzzy logic, have been employed for process synthesis, design, and modeling [2]. More recently, hybrid first-principles–AI approaches have been proposed [3] for reactive systems, using a linear dynamic model obtained by linearizing a nonlinear dynamic model and following it with a static neural network. That approach fails to account for the nonlinearity of the first-principles model and the dynamics of the nonlinear AI model. Furthermore, neither the first-principles model nor the NN model is adaptive. If both the first-principles model and the AI model are dynamic, nonlinear, and adaptive, it becomes considerably more challenging to optimally synthesize such hybrid models with due consideration of complexity, computational expense, and accuracy.
A hybrid first-principles–AI modeling technique has been developed in which both the first-principles and AI models are dynamic and nonlinear, and the two interact with each other, leading to a time-varying model. Instead of a series structure, we propose a hybrid series-parallel structure: rather than propagating information from the first-principles model through the AI model, as in the series structure mentioned above, the AI model learns the residual error of the first-principles model stemming from unmodeled and/or unknown phenomena. The AI model is desired to be dynamic, nonlinear, and adaptive. While existing adaptive NN models such as the adaptive bidirectional associative memory (ABAM) [4] and transversal/recursive filters [5,6] have found applications in signal processing and communication, they are inadequate for many chemical engineering applications where learning must be accomplished with data that may be noisy, limited, and non-informative. In this work, we propose a hybrid static-dynamic NN structure that can quickly learn to equilibrate to a minimum-energy state. The gradient descent algorithm developed for learning efficiently handles exploding and vanishing gradients through a modified batch normalization approach. The algorithm also quantifies optimality gaps at every iteration, reflecting the tradeoff between computational expense and accuracy. To make the learning process computationally efficient for highly parameterized systems, the algorithm down-selects weights using a sensitivity-based approach.
The proposed algorithm is applied to the Van de Vusse reactor model as well as to a supercritical boiler system, where the complex dynamics associated with reactive-diffusive processes leading to oxide-scale formation in the superheater tube banks, coupled with mass and heat transfer, make it a challenging system. Our work shows the tradeoff between computational expense and accuracy. We observe that the stability and speed of the learning algorithm are critical to the success of the proposed algorithm in real-life applications.
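For readers unfamiliar with the first benchmark, the Van de Vusse CSTR (A → B → C, 2A → D) can be simulated with a few lines; this sketch uses common literature values for the rate constants and feed concentration (the Klatt–Engell benchmark), which may differ from those used in this work, and the dilution rate chosen here is only an example operating point.

```python
import numpy as np

# Van de Vusse kinetics: A -> B -> C (k1, k2) and 2A -> D (k3).
k1, k2, k3 = 5.0 / 6.0, 5.0 / 3.0, 1.0 / 6.0  # 1/min, 1/min, L/(mol*min)
Ca_in = 10.0                                   # feed concentration of A, mol/L

def rhs(c, dilution):
    """Mass balances for Ca and Cb; dilution = F/V is the manipulated input."""
    ca, cb = c
    dca = dilution * (Ca_in - ca) - k1 * ca - k3 * ca**2
    dcb = -dilution * cb + k1 * ca - k2 * cb
    return np.array([dca, dcb])

def rk4_step(c, dilution, dt):
    """One explicit fourth-order Runge-Kutta step."""
    s1 = rhs(c, dilution)
    s2 = rhs(c + 0.5 * dt * s1, dilution)
    s3 = rhs(c + 0.5 * dt * s2, dilution)
    s4 = rhs(c + dt * s3, dilution)
    return c + dt / 6.0 * (s1 + 2 * s2 + 2 * s3 + s4)

# Integrate to (near) steady state at a fixed example dilution rate F/V = 4/7.
c = np.array([2.0, 1.0])
for _ in range(5000):
    c = rk4_step(c, dilution=4.0 / 7.0, dt=0.01)
```

With these parameter values the trajectory settles at roughly Ca = 3.0 mol/L and Cb ≈ 1.12 mol/L; the nonlinear, non-minimum-phase behavior of Cb with respect to the dilution rate is what makes this reactor a standard test case for nonlinear dynamic modeling.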
1. Zendehboudi, S., Rezaei, N. & Lohi, A. Applications of hybrid models in chemical, petroleum, and energy systems: A systematic review. Appl. Energy 228, 2539–2566 (2018).
2. Venkatasubramanian, V. The promise of artificial intelligence in chemical engineering: Is it here, finally? AIChE J. 65, 466–478 (2019).
3. Chen, L., Hontoir, Y., Huang, D., Zhang, J. & Morris, A. J. Combining first principles with black-box techniques for reaction systems. Control Eng. Pract. 12, 819–826 (2004).
4. Zupan, J. & Gasteiger, J. Neural networks: A new method for solving chemical problems or just a passing phase? Anal. Chim. Acta 248, 1–30 (1991).
5. Nerrand, O., Roussel-Ragot, P., Personnaz, L., Dreyfus, G. & Marcos, S. Neural networks and nonlinear adaptive filtering: Unifying concepts and new algorithms. Neural Comput. 5, 165–199 (1993).
6. Schädler, K. & Wysotzki, F. Comparing structures using a Hopfield-style neural network. Appl. Intell. 11, 15–30 (1999).