(282d) A Multiparametric Programming Approach to Solving Neural Network-Based Optimization Problems with Application to Control | AIChE


Authors 

Pistikopoulos, E., Texas A&M Energy Institute, Texas A&M University
Kakodkar, R., Texas A&M University
Recently, it has been shown that rectified linear unit (ReLU) based neural networks (NNs) are mixed-integer linear representable and can therefore be incorporated into mixed-integer programming frameworks. This has led to many applications, including detecting adversarial inputs to a ReLU NN classifier [1], explicit model predictive control [2], and optimization of oil production [3]. Many contributions to the literature have focused on effectively tightening ReLU mixed-integer representations, thereby making the resulting mixed-integer program more tractable to solve [1, 4, 5].
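To make the representability claim concrete, the following is a minimal sketch (not any specific formulation from the cited works) of the standard big-M mixed-integer encoding of a single ReLU neuron y = max(0, a), where a is the pre-activation and L ≤ a ≤ U are known valid bounds:

```python
# Big-M encoding of y = max(0, a) with binary z (z = 1: active branch,
# z = 0: inactive branch) and pre-activation bounds L <= a <= U:
#     y >= a,  y >= 0,  y <= a - L*(1 - z),  y <= U*z.

def relu_bigM_feasible(a, y, z, L, U, tol=1e-9):
    """Check whether (a, y, z) satisfies the big-M ReLU constraints."""
    return (y >= a - tol and y >= -tol
            and y <= a - L * (1 - z) + tol
            and y <= U * z + tol)

# For any pre-activation a in [L, U], the point y = max(0, a) with the
# matching binary choice is feasible:
L, U = -2.0, 3.0
for a in [-2.0, -0.5, 0.0, 1.7, 3.0]:
    y = max(0.0, a)
    z = 1 if a > 0 else 0
    assert relu_bigM_feasible(a, y, z, L, U)
```

Note that the encoding is only as strong as the bounds L and U: the tighter they are, the tighter the continuous relaxation, which is precisely why bound tightening matters for solver performance.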

In this presentation, we propose a novel tightening procedure based on a multiparametric programming reformulation of the corresponding ReLU-reformulated optimization problem. The bounding procedure features 1) generating valid tight bounds on the individual auxiliary variables introduced by the ReLU NN reformulation and 2) generating bounds on the binary variables associated with each layer of the ReLU reformulation. The tightened bounds hold for all realizations of the parameters, so the tightening needs to be performed only once, offline. Because the procedure runs offline, more computationally expensive tightening methods can be applied than would be practical if tightening occurred purely online, yielding correspondingly larger computational benefits. We demonstrate the effectiveness of this method in a case study of model predictive control of a nonlinear chemostat whose dynamics are approximated by a ReLU NN.
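As a point of reference for what "bounds on the auxiliary variables" means, the simplest baseline that the proposed multiparametric procedure improves upon is interval arithmetic propagated layer by layer. The weights below are illustrative, not taken from the chemostat case study:

```python
# Hedged sketch: interval bound propagation through one affine + ReLU
# layer. This yields valid (if loose) bounds on each auxiliary ReLU
# variable; stronger procedures (e.g., LP- or multiparametric-based
# tightening) shrink these intervals further.

def interval_bounds(W, b, lo, hi):
    """Propagate the input box [lo, hi] through y = ReLU(W x + b)."""
    out_lo, out_hi = [], []
    for row, bias in zip(W, b):
        # Pre-activation bounds: pick the worst-case input per weight sign.
        a_lo = bias + sum(w * (l if w >= 0 else h)
                          for w, l, h in zip(row, lo, hi))
        a_hi = bias + sum(w * (h if w >= 0 else l)
                          for w, l, h in zip(row, lo, hi))
        # ReLU clips both bounds at zero.
        out_lo.append(max(0.0, a_lo))
        out_hi.append(max(0.0, a_hi))
    return out_lo, out_hi

# Illustrative 2-input, 2-neuron layer over the input box [0, 1]^2:
W = [[1.0, -1.0], [0.5, 2.0]]
b = [0.0, -1.0]
lo, hi = interval_bounds(W, b, [0.0, 0.0], [1.0, 1.0])
```

These bounds also induce bounds on the binaries: a neuron whose pre-activation upper bound is nonpositive is provably inactive (its binary can be fixed to 0), and one whose lower bound is nonnegative is provably active (binary fixed to 1), which is the flavor of the layer-wise binary bounding described above.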

[1] Calvin Tsay, Jan Kronqvist, Alexander Thebelt, and Ruth Misener. Partition-based formulations for mixed-integer optimization of trained ReLU neural networks. Advances in Neural Information Processing Systems, 34:3068–3080, 2021.
[2] Justin Katz, Iosif Pappas, Styliani Avraamidou, and Efstratios N Pistikopoulos. Integrating deep learning models and multiparametric programming. Computers & Chemical Engineering, 136:106801, 2020.
[3] Bjarne Grimstad and Henrik Andersson. ReLU networks as surrogate models in mixed-integer linear programs. Computers & Chemical Engineering, 131:106580, 2019.
[4] Ross Anderson, Joey Huchette, Will Ma, Christian Tjandraatmadja, and Juan Pablo Vielma. Strong mixed-integer programming formulations for trained neural networks. Mathematical Programming, 183(1):3–39, 2020.
[5] Christian Tjandraatmadja, Ross Anderson, Joey Huchette, Will Ma, Krunal Kishor Patel, and Juan Pablo Vielma. The convex relaxation barrier, revisited: Tightened single-neuron relaxations for neural network verification. Advances in Neural Information Processing Systems, 33:21675–21686, 2020.