(40h) EAGO.jl: Next Generation Global & Robust Optimization in Julia, Revisited

Authors 

Wilhelm, M. - Presenter, University of Connecticut
Stuber, M., University of Connecticut
Gottlieb, R., University of Connecticut
Mathematical optimization is a key, if not defining, tool in the modern process-systems engineer’s toolkit. Nonconvex and weakly regular problem forms pervade modern chemical process simulations and have motivated the development of a number of specialized solvers [1, 2, 3, 4]. To better address the wide variety of problems encountered in this area, we introduced EAGO.jl [5], a customizable and performant open-source global optimizer written in Julia. EAGO.jl’s standard optimizer makes use of recent advances in reduced-space McCormick relaxations [6–11] to solve optimization problems to a certificate of global optimality, and all of its major subroutines can be readily extended for specialized use cases. Since EAGO.jl was introduced, the Julia language [12] and its optimization [13, 14] and machine-learning [15, 16] communities have entered a more stable stage of maturity. In this talk, we discuss recent improvements made to EAGO.jl’s backend and API in the context of these developments.
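
As a brief illustration of this workflow, the sketch below solves a small nonconvex NLP to global optimality through EAGO.jl’s JuMP interface. The model itself is illustrative only, and the macros reflect the JuMP nonlinear API of this era; it is a minimal sketch, not a problem from the talk.

```julia
using JuMP, EAGO

# Build a small nonconvex NLP (illustrative example) and hand it to
# EAGO's default branch-and-bound optimizer via the JuMP interface.
model = Model(EAGO.Optimizer)
@variable(model, -2.0 <= x <= 2.0)
@variable(model, -1.0 <= y <= 1.0)

# Nonconvex objective and constraint; EAGO relaxes these expressions
# to compute the bounds needed to certify global optimality.
@NLobjective(model, Min, (1.0 - x)^2 + 100.0*(y - x^2)^2)
@NLconstraint(model, x^2 + y^2 <= 1.0)

optimize!(model)
println(value(x), " ", value(y), " ", objective_value(model))
```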

EAGO.jl’s new modular directed-acyclic multigraph (DAMG) backend substantially improves support for extensive user customization; we discuss how it enables user-defined composite relaxations and the calculation of gradient information. More importantly, this backend standardizes a number of tree-walking features present in EAGO.jl, including common subexpression elimination, automatic differentiation [17], disciplined convex programming [18], forward-reverse propagation of relaxations, and set-valued enclosures of intermediate subexpressions [9, 19, 20]. This is coupled with new functionality in the broader EAGO.jl package for the simplified detection and reformulation of hidden conic forms. Moreover, the unified DAMG representation naturally accommodates both a reduced-space model and an auxiliary-variable reformulation of the same problem. We exploit this dual representation to develop preliminary results that address a central question surrounding factorable programming approaches: “When might auxiliary variables serve to improve solver performance and reliability?” [6, 11]. In addition to this key revision of EAGO.jl’s backend, several improvements have been made to support specialized model forms.
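
To make the forward propagation of relaxations concrete, the following sketch uses McCormick.jl, the relaxation library underpinning EAGO.jl, to propagate convex/concave bounds and subgradients through a composite expression. The expression and evaluation point are arbitrary examples, not ones from the talk.

```julia
using McCormick

# Construct a McCormick object for the variable x at the point 1.0 on
# the interval [0, 4]; the trailing 1 marks x as decision variable 1 of
# 1 (the first type parameter), and NS selects nonsmooth relaxation rules.
x = MC{1,NS}(1.0, Interval{Float64}(0.0, 4.0), 1)

# Forward propagation through a composite nonconvex expression: each
# overloaded operation updates the interval bounds, the convex and
# concave relaxation values, and the associated subgradients.
y = x*exp(-x^2) + sin(x)

@show y.Intv      # interval enclosure of the expression over [0, 4]
@show y.cv, y.cc  # convex/concave relaxation values at x = 1.0
@show y.cv_grad   # subgradient of the convex relaxation
```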

EAGO.jl now extends the JuMP modeling language to provide seamless support for solving semi-infinite programs and models with multi-input multi-output subexpressions. This has allowed users to address embedded hybrid neural-network models [19] as well as dynamical systems. We discuss our recent theoretical advances in constructing relaxations of both embedded hybrid models [19] and dynamical systems, which have recently been incorporated into EAGO.jl’s core functionality. We conclude by illustrating EAGO.jl’s flexibility and performance on case studies from the literature.
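
As an illustration of the semi-infinite programming interface, the sketch below follows the pattern of EAGO.jl’s documented `sip_solve` routine on a standard two-variable SIP test problem; exact function signatures, keyword names, and result fields may differ across package versions, so treat this as a hedged sketch rather than a definitive usage.

```julia
using EAGO

# Semi-infinite test problem: minimize f(x) subject to
# gSIP(x, p) <= 0 for all p in [p_l, p_u].
f(x) = (1.0/3.0)*x[1]^2 + x[2]^2 + x[1]/2.0
gSIP(x, p) = (1.0 - x[1]^2*p[1]^2)^2 - x[1]*p[1]^2 - x[2]^2 + x[2]

# Bounds on the decision variables x and the uncertain parameters p.
x_l = [-1000.0, -1000.0];  x_u = [1000.0, 1000.0]
p_l = [0.0];               p_u = [1.0]

# Solve with the restriction-of-the-right-hand-side algorithm (SIPRes);
# the returned object collects the bound sequence and incumbent point.
result = sip_solve(SIPRes(), x_l, x_u, p_l, p_u, f, Any[gSIP],
                   res_sip_absolute_tolerance = 1.0e-3)
```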

REFERENCES

[1] Horst, R., and Tuy, H. Global Optimization: Deterministic Approaches. Springer Berlin Heidelberg, 2013.

[2] Floudas, C. A., and Gounaris, C. E. "A review of recent advances in global optimization." Journal of Global Optimization 45(1): 3-38, 2009.

[3] Tawarmalani, M., and Sahinidis, N. V. "A polyhedral branch-and-cut approach to global optimization." Mathematical Programming 103(2): 225-249, 2005.

[4] Misener, R., and Floudas, C. A. "ANTIGONE: algorithms for continuous/integer global optimization of nonlinear equations." Journal of Global Optimization 59(2-3): 503-526, 2014.

[5] Wilhelm, M. E., and Stuber, M. D. "EAGO.jl: easy advanced global optimization in Julia." Optimization Methods and Software, 1-26, 2020.

[6] Mitsos, A., Chachuat, B., and Barton, P. I. "McCormick-based relaxations of algorithms." SIAM Journal on Optimization 20(2): 573-601, 2009.

[7] Scott, J. K., Stuber, M. D., and Barton, P. I. "Generalized McCormick relaxations." Journal of Global Optimization 51: 569-606, 2011.

[8] Khan, K. A., Watson, H. A. J., and Barton, P. I. "Differentiable McCormick relaxations." Journal of Global Optimization 67(4): 687-729, 2017.

[9] Wechsung, A., Scott, J. K., Watson, H. A. J., and Barton, P. I. "Reverse propagation of McCormick relaxations." Journal of Global Optimization 63(1): 1-36, 2015.

[10] Tsoukalas, A., and Mitsos, A. "Multivariate McCormick relaxations." Journal of Global Optimization 59(2-3): 633-662, 2014.

[11] Najman, J., Bongartz, D., and Mitsos, A. "Linearization of McCormick relaxations and hybridization with the auxiliary variable method." Journal of Global Optimization, 1-26, 2021.

[12] Bezanson, J., Edelman, A., Karpinski, S., and Shah, V. B. "Julia: A fresh approach to numerical computing." SIAM Review 59(1): 65-98, 2017.

[13] Dunning, I., Huchette, J., and Lubin, M. "JuMP: A modeling language for mathematical optimization." SIAM Review 59(2): 295-320, 2017.

[14] Legat, B., Dowson, O., Garcia, J., and Lubin, M. "MathOptInterface: a data structure for mathematical optimization problems." INFORMS Journal on Computing. In press.

[15] Innes, M. "Flux: Elegant machine learning with Julia." Journal of Open Source Software 3(25): 602, 2018.

[16] Rackauckas, C., et al. "Universal differential equations for scientific machine learning." arXiv preprint arXiv:2001.04385, 2020.

[17] Khan, K. A., and Barton, P. I. "A vector forward mode of automatic differentiation for generalized derivative evaluation." Optimization Methods and Software 30(6): 1185-1212, 2015.

[18] Grant, M., Boyd, S., and Ye, Y. "Disciplined convex programming." In Global Optimization, Springer, Boston, MA, 2006: 155-210.

[19] Wilhelm, M. E., Wang, C., and Stuber, M. D. "Robust Optimization with Hybrid First-Principles Data-Driven Models." Under review.

[20] Schichl, H., and Neumaier, A. "Interval analysis on directed acyclic graphs for global optimization." Journal of Global Optimization 33(4): 541-562, 2005.

[21] Najman, J., and Mitsos, A. "Tighter McCormick relaxations through subgradient propagation." Journal of Global Optimization 75(3): 565-593, 2019.