(15b) A Lagrangean-Based Branch-and-Cut Technique for Global Optimization of MINLP Problems with Decomposable Structures

Authors: 
Karuppiah, R., Carnegie Mellon University
Grossmann, I. E., Carnegie Mellon University


Many relevant optimization problems in Process Systems Engineering are non-convex problems that involve non-linearities and/or discrete variables (Adjiman et al. (1997), Lee and Grossmann (2001), Grossmann (2002), Tawarmalani and Sahinidis (2004)). Solving non-convex non-linear programs (NLPs) and mixed-integer non-linear programs (MINLPs) to global optimality is NP-hard (Tawarmalani and Sahinidis (2002)). Therefore, existing algorithms often exhibit exponential time requirements for rigorously finding the global optimum. This implies that exploiting problem structure is of utmost importance if these problems are to be solved with reasonable computational expense.

An important class of large-scale non-convex problems is one in which a number of non-convex models are combined into a single model such that, for a fixed subset of variables, the problem decomposes into several sub-problems. In particular, problems with this decomposable structure commonly arise in two-stage stochastic programming (Birge and Louveaux (1997)), which is widely used for optimization under uncertainty. Usually, some form of branch and bound search is applied to carry out the global optimization of such non-convex problems. In these branch and bound schemes, bounds on the global optimum are obtained by solving relaxations constructed by convexifying the non-convex terms in the model. The major challenge then lies in developing tight relaxations that yield strong bounds on the global optimum, which in turn reduces solution times. This work is primarily aimed at addressing this issue.
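The decomposable structure described above can be sketched schematically as a two-stage stochastic program; the notation here is an illustrative assumption of ours (the symbols $x$, $y_s$, $p_s$, $f$, $g_s$, $h_s$ are not necessarily the authors' own):

```latex
% Two-stage stochastic program: first-stage (linking) variables x,
% scenario variables y_s, and scenario probabilities p_s over s in S.
\begin{aligned}
\min_{x,\; y_s} \quad & f(x) + \sum_{s \in S} p_s\, g_s(x, y_s) \\
\text{s.t.} \quad     & h_s(x, y_s) \le 0, \qquad s \in S, \\
                      & x \in X, \quad y_s \in Y_s, \qquad s \in S.
\end{aligned}
% For fixed x, the model separates into |S| independent sub-problems,
% one per scenario s -- this is the decomposable structure exploited.
```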

We present a global optimization algorithm for solving a class of large-scale non-convex MINLPs that have a decomposable structure, and we provide a generic formulation for this class of problems, which involves binary variables and non-convex non-linear terms. Such models consist of otherwise independent sets of constraints that are coupled through linking constraints; on removing these links, the problem decomposes into a number of smaller sub-problems. We then propose a specialized deterministic branch and cut algorithm to solve these models to global optimality, wherein bounds on the global optimum are obtained by solving convex relaxations of the models with cuts added to them. The cuts are derived from the global optima of the sub-problems obtained by decomposing the original non-convex model using Lagrangean decomposition. Adding these cuts to the convex relaxations of the original MINLPs yields tight relaxations and hence strong bounds on the global optima of the original decomposable models, which accelerates the search for the solution within the branch and cut framework. The main idea in this work is thus to combine the concepts of Lagrangean decomposition and convex relaxation of non-convex models in order to generate tight bounds on their global optima.
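As a hedged sketch of how such cuts can be generated (our notation and indexing are illustrative assumptions, not the authors' exact formulation): duplicating the linking variables per scenario and dualizing the resulting non-anticipativity constraints gives

```latex
% Copies x_s of the linking variables, with x_1 = x_2 = ... = x_{|S|}
% dualized via multipliers \lambda_s (\lambda_0 = \lambda_{|S|} = 0);
% the Lagrangean dual function decomposes by scenario:
\begin{aligned}
L(\lambda) &= \sum_{s \in S}
  \underbrace{\min_{(x_s,\, y_s) \in F_s}
  \Big[ f_s(x_s, y_s) + (\lambda_s - \lambda_{s-1})^{\top} x_s \Big]}_{q_s(\lambda)}
\end{aligned}
% Each inner problem is a smaller non-convex MINLP solved to global
% optimality; its optimal value q_s(\lambda) yields a valid cut that
% is appended to the convex relaxation of the full model:
f_s(x_s, y_s) + (\lambda_s - \lambda_{s-1})^{\top} x_s \;\ge\; q_s(\lambda).
```

Summing these cuts recovers the Lagrangean lower bound $L(\lambda)$, so the strengthened relaxation is at least as tight as the better of the plain convex relaxation and the Lagrangean bound.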

We apply the proposed method to the optimal synthesis of an integrated water system operating under uncertain operational conditions that are described by discrete distribution functions. Here, water-using processes and water treatment operations have to be combined into a single network such that the total cost of building the network and operating it optimally over the different scenarios is globally minimized. We propose a superstructure whose optimization is formulated as a multiscenario non-convex MINLP, corresponding to a two-stage stochastic programming model that can grow rapidly in size with the number of scenarios. Several examples were solved using the proposed algorithm, including one with 24 binary variables, 764 continuous variables, 928 constraints and 406 non-convex terms, and it was found that the proposed algorithm reduces the solution time by several orders of magnitude when compared to existing methods.