(200a) Using Convex Nonlinear Relaxations In the Global Optimization of Nonconvex Generalized Disjunctive Programs | AIChE

Authors 

Ruiz, J. P. - Presenter, Carnegie Mellon University
Grossmann, I. - Presenter, Carnegie Mellon University


Many problems in engineering can be represented by a set of algebraic expressions involving continuous and discrete variables, leading to a Mixed-Integer Nonlinear Program (MINLP). To accurately represent the behavior of physical, chemical, biological, financial or social systems, nonlinear expressions are often required. In general, this leads to an MINLP whose solution space is nonconvex and hence difficult to solve, since local optimization methods may converge to suboptimal local solutions. In the last decade, many global optimization algorithms for nonconvex problems have been proposed [1,2,3]. The efficiency of these methods relies heavily on having tight relaxations, which is why many of the contributions in this area have addressed this subject. In general, however, finding the global optimum of large-scale nonconvex MINLP models in reasonable computational time remains a largely unsolved problem.

Raman and Grossmann [4] presented an alternative way to represent discrete/continuous optimization problems using not only algebraic expressions but also disjunctions and logic propositions, giving rise to Generalized Disjunctive Programming (GDP), which can be regarded as an extension of Disjunctive Programming [5]. This higher-level representation not only facilitates the modeling, but also preserves the underlying logical structure of the problem, which can be exploited to find stronger relaxations and, hence, to develop more efficient solution methods. With this in mind, we recently presented a method to solve nonconvex Generalized Disjunctive Programs that involve bilinear and concave functions [6]. The methodology obtains strong relaxations through a two-step procedure. In the first step, the nonconvex functions are replaced with polyhedral relaxations, leading to a Linear GDP. In the second step, the results of Sawaya and Grossmann [7] are used. These authors showed that a stronger relaxation of a Linear GDP can be obtained by applying basic steps, a process that consists of intersecting disjunctions to obtain equivalent disjunctive sets whose hull relaxation is tighter. Even though this method showed significant improvements over traditional approaches, its efficiency depends on the strength of the polyhedral relaxations of the nonconvex functions.
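For reference, the nonconvex GDP problems discussed here can be stated in the standard form of Raman and Grossmann [4]; the notation below is the usual one from the GDP literature, not reproduced from this abstract:

```latex
\min\ Z \;=\; \sum_{k \in K} c_k \;+\; f(x) \\
\text{s.t.}\quad g(x) \le 0, \\
\quad \bigvee_{i \in D_k}
  \begin{bmatrix} Y_{ik} \\ r_{ik}(x) \le 0 \\ c_k = \gamma_{ik} \end{bmatrix},
  \qquad k \in K, \\
\quad \Omega(Y) = \text{True}, \\
\quad x \in X \subseteq \mathbb{R}^n, \quad c_k \in \mathbb{R}, \quad
  Y_{ik} \in \{\text{True}, \text{False}\},
```

where $f$, $g$ and the $r_{ik}$ may be nonconvex (e.g. bilinear, concave or linear fractional), each disjunction $k$ selects exactly one disjunct through the Boolean variables $Y_{ik}$, and $\Omega(Y)$ collects the logic propositions.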

In this paper we address the global optimization of GDP problems that, in addition to bilinear and concave terms, involve other terms, such as linear fractional terms, for which nonlinear convex relaxations have been shown to provide rigorous convex envelopes that are much tighter than linear relaxations. The use of nonlinear convex relaxations leads to a nonlinear convex GDP whose relaxation can be strengthened by using recent results from the work of Ruiz and Grossmann [8].
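As a concrete illustration of why a nonlinear convex relaxation of a fractional term can dominate a purely linear one, consider w = x/y with y bounded away from zero. A standard construction (illustrative only; the bounds and test points below are our own choices, not taken from this work) substitutes z = 1/y, so that w = x·z becomes bilinear and can be bounded by linear McCormick cuts, while the defining relation z = 1/y is either kept as the convex constraint z >= 1/y (nonlinear relaxation) or outer-approximated by finitely many tangent cuts (linear relaxation):

```python
# Relaxations of z = 1/y for the fractional term w = x/y,
# illustrated over y in [1, 2] (so z = 1/y lies in [1/2, 1]).
# Since 1/y is convex for y > 0, z >= 1/y is a valid convex constraint;
# a purely linear relaxation must replace it by tangent cuts.

Y_LO, Y_UP = 1.0, 2.0

def tangent_cut(y, y0):
    """Tangent underestimator of 1/y at y0: 1/y0 - (y - y0)/y0**2."""
    return 1.0 / y0 - (y - y0) / y0**2

def secant(y):
    """Chord of 1/y between the bounds; overestimates 1/y by convexity."""
    return 1.0 / Y_LO + (1.0 / Y_UP - 1.0 / Y_LO) * (y - Y_LO) / (Y_UP - Y_LO)

def linear_feasible(y, z):
    """Linear relaxation: two tangent cuts below, secant above."""
    return (z >= tangent_cut(y, Y_LO) and z >= tangent_cut(y, Y_UP)
            and z <= secant(y))

def nonlinear_feasible(y, z):
    """Nonlinear convex relaxation: keep z >= 1/y exactly."""
    return z >= 1.0 / y and z <= secant(y)

# Validity: every true point (z = 1/y) satisfies both relaxations.
for i in range(11):
    y = Y_LO + i * (Y_UP - Y_LO) / 10
    assert linear_feasible(y, 1.0 / y) and nonlinear_feasible(y, 1.0 / y)

# Tightness: at y = 1.5 the tangent cuts only enforce z >= 0.625, while
# the convex constraint enforces z >= 1/1.5 ~ 0.667, cutting this point off.
print(linear_feasible(1.5, 0.63))      # True  (admitted by the linear cuts)
print(nonlinear_feasible(1.5, 0.63))   # False (cut off by z >= 1/y)
```

Points admitted by the linear cuts but violating z >= 1/y translate directly into weaker lower bounds on w, which is the gap the nonlinear convex relaxations are meant to close.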

We first define the general nonconvex GDP problem that we aim to solve, and review the traditional method for finding relaxations. Second, we show how the relaxation of the traditional approach can be strengthened by presenting a systematic procedure to generate a hierarchy of relaxations based on the application of basic steps to nonlinear convex disjunctive sets. We outline a set of rules that avoids the exponential transformation to the Disjunctive Normal Form, leading to a more efficient implementation of the method. Finally, we assess the performance of the method by solving to global optimality several instances of process network, reactor network and heat exchanger network problems. It is shown that the proposed strong nonlinear relaxations lead to stronger lower bound predictions and to a decrease in the number of nodes required when used within an NLP-based spatial branch-and-bound framework, and as a result to reduced computational times.
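To make the effect of a basic step concrete, the following toy linear example (our own construction, not one of the test problems above) intersects a global constraint into a two-term disjunction and checks, by point membership, that the resulting hull relaxation is strictly tighter:

```python
# Toy basic step on a linear disjunctive set.
# Disjunction:  [x1 = 0, 0 <= x2 <= 1]  OR  [x2 = 0, 0 <= x1 <= 1]
# Global constraint:  x1 = x2.
# The only truly feasible point is (0, 0).

def hull_before(x1, x2):
    """Hull relaxation taken disjunct-by-disjunct, then intersected with
    the global constraint: conv(union of the two axis segments) is the
    triangle x1, x2 >= 0, x1 + x2 <= 1."""
    return x1 >= 0 and x2 >= 0 and x1 + x2 <= 1 and x1 == x2

def hull_after(x1, x2):
    """Hull relaxation after one basic step: intersecting x1 = x2 into
    each disjunct shrinks both disjuncts to the single point (0, 0),
    so the hull of the resulting disjunction is exact."""
    return x1 == 0 and x2 == 0

# The true feasible point survives in both relaxations...
assert hull_before(0, 0) and hull_after(0, 0)
# ...but the basic step cuts off relaxation-only points such as (0.4, 0.4).
print(hull_before(0.4, 0.4))  # True  (gap in the relaxation)
print(hull_after(0.4, 0.4))   # False (relaxation is now exact)
```

The same mechanism, applied to nonlinear convex disjunctive sets, is what generates the hierarchy of increasingly tight relaxations described above.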

REFERENCES

[1] Floudas, C. A., Deterministic Global Optimization: Theory, Methods and Applications, Dordrecht, The Netherlands: Kluwer Academic Publishers, 2000.

[2] Tawarmalani, M. and Sahinidis, N., Convexification and Global Optimization in Continuous and Mixed-Integer Nonlinear Programming, Kluwer Academic Publishers, 2002.

[3] Horst, R. and Tuy, H., Global Optimization: Deterministic Approaches (3rd ed.), Berlin: Springer-Verlag, 1996.

[4] Raman R. and Grossmann I.E., Modelling and Computational Techniques for Logic-Based Integer Programming, Computers and Chemical Engineering, 18, 563, 1994.

[5] Balas E., Disjunctive Programming, Annals of Discrete Mathematics, 5, 3-51, 1979.

[6] Ruiz, J.P. and Grossmann, I.E., Strengthening lower bounds for the global optimization of non-convex Generalized Disjunctive Programs, Computers and Chemical Engineering, 34:6, 914-930, 2010.

[7] Sawaya, N., Reformulations, Relaxations and Cutting Planes for Generalized Disjunctive Programming, Ph.D. Thesis, Carnegie Mellon University, 2006.

[8] Ruiz, J.P. and Grossmann, I.E., A hierarchy of relaxations for convex Generalized Disjunctive Programs, Submitted for publication.