(587a) SchurIpopt: A Parallel Optimization Package for Structured Nonlinear-Programming Problems

Authors: 
Rodriguez, J. S., Purdue University
Hackebeil, G., Oregon State University
Laird, C., Purdue University

Optimization plays an important role in a variety of areas, including process design, operation, and control. Large-scale nonlinear optimization problems that commonly arise in these areas need to be solved efficiently. These problems are characterized by a large number of equations and variables, and solving them can become computationally prohibitive. However, they often have a structure that can be exploited using the parallel computing capabilities offered by modern computers. In this work, we developed SchurIpopt, a parallel extension of Ipopt that solves structured NLP problems efficiently on both shared-memory and distributed-memory parallel architectures.

Interior-point methods have proven effective for solving large-scale nonlinear programming problems. The dominant computational steps in an interior-point algorithm are the solution of the KKT system and the evaluation of NLP functions and derivatives at every interior-point iteration. Our implementation uses a Schur-complement decomposition strategy to exploit the structure of NLP problems arising from multi-scenario and dynamic optimization applications. In both cases, the inherent structure can be exploited by decomposing the problem to overcome the memory and computing-time limitations that commonly arise with large-scale problems. To achieve high parallel efficiency, the implementation parallelizes not only the solution of the KKT system but also the function evaluations and scale-dependent operations such as vector-vector and matrix-vector products. The algorithm has been interfaced with PySP, an extension of Pyomo for modeling stochastic programming problems. To illustrate the performance of the implementation, we present two case studies in stochastic programming and dynamic optimization.
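As a rough illustration of the Schur-complement strategy described above, the sketch below solves a small block-bordered ("arrowhead") linear system of the kind that arises when a multi-scenario KKT system is permuted so that scenario blocks couple only through shared variables. The block sizes, dense NumPy linear algebra, and random data are illustrative assumptions for this sketch only; SchurIpopt itself works with sparse KKT systems inside the interior-point iteration.

```python
import numpy as np

# Block-bordered ("arrowhead") system with two independent blocks K1, K2
# coupled to shared variables x0 through borders B1, B2:
#
#   [ K1    0     B1 ] [x1]   [r1]
#   [ 0     K2    B2 ] [x2] = [r2]
#   [ B1^T  B2^T  K0 ] [x0]   [r0]
#
# Each per-block solve is independent, so the loop below is the part
# that a parallel implementation would distribute across scenarios.

rng = np.random.default_rng(0)

def spd(n):
    """Random symmetric positive-definite block (illustrative data)."""
    A = rng.standard_normal((n, n))
    return A @ A.T + 10.0 * np.eye(n)

n1, n2, n0 = 4, 5, 3
K1, K2, K0 = spd(n1), spd(n2), spd(n0)
B1 = rng.standard_normal((n1, n0))
B2 = rng.standard_normal((n2, n0))
r1, r2, r0 = (rng.standard_normal(n) for n in (n1, n2, n0))

# Schur complement S = K0 - sum_i Bi^T Ki^{-1} Bi and matching
# right-hand side; each term in the sum is computable in parallel.
S = K0.copy()
rhs = r0.copy()
for K, B, r in [(K1, B1, r1), (K2, B2, r2)]:
    S -= B.T @ np.linalg.solve(K, B)
    rhs -= B.T @ np.linalg.solve(K, r)

x0 = np.linalg.solve(S, rhs)            # coupling (shared) variables
x1 = np.linalg.solve(K1, r1 - B1 @ x0)  # independent back-solve, block 1
x2 = np.linalg.solve(K2, r2 - B2 @ x0)  # independent back-solve, block 2

# Verify against a monolithic solve of the assembled system.
full = np.block([
    [K1, np.zeros((n1, n2)), B1],
    [np.zeros((n2, n1)), K2, B2],
    [B1.T, B2.T, K0],
])
x_full = np.linalg.solve(full, np.concatenate([r1, r2, r0]))
err = np.linalg.norm(np.concatenate([x1, x2, x0]) - x_full)
```

The same elimination pattern extends to any number of scenario blocks: only the (small, dense) Schur complement is solved serially, while the block factorizations, border products, and back-solves parallelize across scenarios.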