(461f) Optimization of Black-Box Problems Using Smolyak Grids and Polynomial Approximations

A new surrogate-based optimization method for black-box problems is proposed, which aims to converge to the global optimum of box-constrained problems using only input-output data. Optimization methods based on the exchange of designed data streams between the optimization solver and the data-generating system (simulation or experiment) have gained significant interest recently [1-3]. Such data-driven optimization methods have already found a wide range of applications, including mechanical and structural design, complex flowsheet optimization and process synthesis, geosciences, and parameter estimation of large ODE systems. Several challenges remain regarding the computational tractability of these methods as problem size increases, especially since the cost of each simulation or experiment limits the amount of data that can be exchanged. Moreover, because algebraic forms and derivatives are unavailable, reliable convergence to optimality in a finite number of iterations is hard to achieve for a wide range of problems. There is therefore strong interest in black-box optimization methods that locate better solutions by searching the feasible space globally and that converge to optimal solutions at improved rates with a limited number of samples.

The proposed work tests the performance of a new surrogate-based optimization algorithm based on Smolyak-grid sampling and polynomial interpolation, techniques that have predominantly been used for numerical integration [4]. The algorithm starts with a global search of the multidimensional space using a Smolyak grid [5] constructed from Chebyshev extrema in the one-dimensional space. The collected samples are used to fit polynomial interpolants, which serve as surrogates in the search for the optimum. The algorithm adaptively refines the grid by collecting new points in promising regions, and iteratively shrinks the search space around the incumbent sample until the search domain reaches a minimum size and convergence has been attained. The algorithm is tested on a large set of benchmark problems, and its performance is compared to that of a recent algorithm for global optimization of grey-box problems using quadratic, kriging, and radial basis functions [6]. It is shown that the proposed algorithm performs consistently and reliably on the majority of test problems, independently of the initialization of the sampling set. This reliability is attributed to the use of Chebyshev-based sparse grids, which have been shown to approximate complex functions with tight error bounds.
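To make the grid construction concrete, the following is a minimal sketch, not the authors' implementation: it builds nested one-dimensional Chebyshev-extrema grids, combines them into a Smolyak sparse grid, and runs a simplified box-shrinking search. The function names (`cheb_extrema`, `smolyak_grid`, `optimize`) and all parameter choices are illustrative assumptions, and the polynomial-interpolant surrogate step of the actual method is omitted here; the incumbent is simply the best sampled point.

```python
import numpy as np
from itertools import product


def cheb_extrema(level):
    """Nested 1-D Chebyshev extrema (Gauss-Lobatto points) on [-1, 1]."""
    if level == 0:
        return np.array([0.0])
    m = 2**level + 1  # 3, 5, 9, ... points; each level contains the previous
    return np.cos(np.pi * np.arange(m) / (m - 1))


def smolyak_grid(dim, max_level):
    """Union of tensor products of 1-D grids with total level <= max_level."""
    pts = set()
    for levels in product(range(max_level + 1), repeat=dim):
        if sum(levels) <= max_level:
            for p in product(*(cheb_extrema(l) for l in levels)):
                # round so nested points from different levels deduplicate
                pts.add(tuple(np.round(p, 12)))
    return np.array(sorted(pts))


def optimize(f, lo, hi, max_level=3, shrink=0.5, min_width=1e-3):
    """Sample a Smolyak grid, then shrink the box around the incumbent.

    Simplification of the method described above: the surrogate-fitting
    step is skipped and the best sampled point is used directly.
    """
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    base = smolyak_grid(len(lo), max_level)        # reference grid on [-1, 1]^d
    best_x, best_f = None, np.inf
    while np.max(hi - lo) > min_width:
        X = lo + (base + 1.0) * (hi - lo) / 2.0    # map grid into current box
        F = np.array([f(x) for x in X])
        i = np.argmin(F)
        if F[i] < best_f:
            best_x, best_f = X[i], F[i]
        # recenter a smaller box on the incumbent, clipped to the original box
        half = shrink * (hi - lo) / 2.0
        lo = np.maximum(lo, best_x - half)
        hi = np.minimum(hi, best_x + half)
    return best_x, best_f
```

For two dimensions, the grid at total level 1 has the familiar 5 points (center plus one pair per axis), and at level 3 it has 29 points, far fewer than the 81 points of the corresponding full 9x9 tensor grid, which is the motivation for sparse-grid sampling in higher dimensions.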

This work is dedicated to the authors' mentor and postdoctoral advisor, Christodoulos Floudas, who introduced the authors to this area.


1. Conn, A.R., K. Scheinberg, and L.N. Vicente, Introduction to derivative-free optimization. 2009, Philadelphia: SIAM.

2. Boukouvala, F., R. Misener, and C.A. Floudas, Global optimization advances in Mixed-Integer Nonlinear Programming, MINLP, and Constrained Derivative-Free Optimization, CDFO. European Journal of Operational Research, 2016. 252(3): p. 701-727.

3. Rios, L.M. and N.V. Sahinidis, Derivative-free optimization: a review of algorithms and comparison of software implementations. Journal of Global Optimization, 2013. 56(3): p. 1247-1293.

4. Gerstner, T. and M. Griebel, Numerical integration using sparse grids. Numerical Algorithms, 1998. 18(3-4): p. 209-232.

5. Smolyak, S.A., Quadrature and interpolation formulas for tensor products of certain classes of functions. Dokl. Akad. Nauk SSSR, 1963.

6. Boukouvala, F. and C.A. Floudas, ARGONAUT: AlgoRithms for Global Optimization of coNstrAined grey-box compUTational problems. Optimization Letters, 2016: p. 1-19.