(234g) A Novel Surrogate-Based Derivative-Free Optimization Algorithm
2022 AIChE Annual Meeting
Computing and Systems Technology Division
Advances in Nonlinear and Surrogate Optimization
Tuesday, November 15, 2022 - 9:48am to 10:06am
Considerable progress has been made in developing approaches for derivative-free optimization (DFO) that use surrogate models to approximate relationships that are not explicitly known in the systems of interest (Boukouvala et al., 2016; Rios & Sahinidis, 2013). The surrogate approximations aim to guide the search toward the optimum of the original model. However, these algorithms can be difficult to scale to problems with high dimensionality (Bhosekar & Ierapetritou, 2018; Qian et al., 2016). To address this dimensionality issue, we have developed a two-tiered approach for surrogate-based optimization: the first stage shrinks the search space through a global search with random forests, and the second stage locates the optimum via a local search within the reduced space.
In its initial stage, the algorithm reduces the size of the search space by solving a series of global deterministic subproblems built from random forest (RF) models (Breiman, 2001). For each subproblem, an RF model is constructed to approximate the original objective function and is iteratively updated with new sample points selected by an adaptive sampling method, Optimization Directed INcremental (ODIN) sampling. ODIN adds new sample points based on a space-filling criterion and by exploiting the regions around the previously identified best solution (Smith, 2015). Previous work has demonstrated that RF models are particularly adept at identifying optimal decision variable values, even for problems with a high number of input dimensions (Williams & Cremaschi, 2021). An RF model is a collection of decision trees whose individual predictions are averaged to produce the final output (Breiman, 2001); formulating the trained RF model as an optimization problem therefore yields a mixed-integer linear program (MILP). At each iteration of the algorithm, the search space is reduced by solving the resulting RF MILP and applying the thresholds of the decision tree rules that lead to the optimal solution as new, tighter bounds on the decision variables.
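As an illustration of the bound-reduction idea, the following minimal sketch (not the implementation used in this work) trains a scikit-learn random forest on samples of a simple test function and tightens the variable bounds using the split thresholds along each tree's decision path to the incumbent point. In the actual algorithm, the point driving the reduction comes from solving the RF-based MILP, and sampling is performed with ODIN rather than uniformly; the objective, sample sizes, and forest settings below are illustrative only.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Illustrative black-box objective: a shifted sphere function on [-5, 5]^5
def objective(X):
    return np.sum((X - 1.0) ** 2, axis=1)

dim = 5
lb, ub = np.full(dim, -5.0), np.full(dim, 5.0)

# Initial design (uniform sampling here; the algorithm uses ODIN adaptive sampling)
X = rng.uniform(lb, ub, size=(200, dim))
y = objective(X)

rf = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# Stand-in for the RF MILP optimum: the best sampled point (the incumbent)
x_star = X[np.argmin(y)].reshape(1, -1)

# Tighten the box using the split thresholds on each tree's path to x_star
new_lb, new_ub = lb.copy(), ub.copy()
for tree in rf.estimators_:
    t = tree.tree_
    for node in tree.decision_path(x_star).indices:   # nodes visited by x_star
        feat, thr = t.feature[node], t.threshold[node]
        if feat < 0:                                   # leaf node, no split rule
            continue
        if x_star[0, feat] <= thr:
            new_ub[feat] = min(new_ub[feat], thr)      # went left: x[feat] <= thr
        else:
            new_lb[feat] = max(new_lb[feat], thr)      # went right: x[feat] > thr

print("reduced lower bounds:", np.round(new_lb, 2))
print("reduced upper bounds:", np.round(new_ub, 2))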
Once a suitable reduction of the search space is achieved in the RF modeling stage, the next step in the DFO algorithm is to refine the incumbent solution with a local search within the reduced space. Candidate surrogate models for this second stage include Multivariate Adaptive Regression Splines (MARS) (Friedman, 1991) and Automated Learning of Algebraic Models for Optimization (ALAMO) (Cozad et al., 2014) models. As in the first stage, a surrogate model is constructed and iteratively improved with adaptive sampling. Based on the final form of the trained surrogate model, a deterministic optimization problem is formulated and solved to refine the solution; MARS and ALAMO models yield mixed-integer nonlinear programming (MINLP) and nonlinear programming (NLP) problems, respectively. This presentation will introduce the details of the algorithm and discuss the selection of the appropriate surrogate modeling technique for the second stage as well as the adaptive sampling method used at each stage. The algorithm is compared to several commonly used derivative-free optimization algorithms, including NOMAD and SNOBFIT, on a suite of optimization test functions, and these results will also be presented.
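A minimal sketch of the second-stage refinement loop is given below under simplifying assumptions: a quadratic response surface stands in for the MARS or ALAMO surrogate, scipy.optimize.minimize replaces the deterministic NLP/MINLP solve, and new samples are added only at the surrogate optimum rather than by the adaptive sampling method of the algorithm. The reduced bounds, variable names, and fixed iteration count are illustrative, not part of the algorithm as presented.

import numpy as np
from scipy.optimize import minimize
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(1)

# Same illustrative objective as above, evaluated one point at a time
def objective(x):
    return float(np.sum((np.asarray(x) - 1.0) ** 2))

dim = 5
local_lb, local_ub = np.full(dim, 0.0), np.full(dim, 2.0)  # reduced box from stage one

# Initial sample inside the reduced search space
X = rng.uniform(local_lb, local_ub, size=(30, dim))
y = np.array([objective(x) for x in X])

for _ in range(10):
    # Fit the local surrogate (quadratic response surface as a placeholder for MARS/ALAMO)
    surrogate = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)

    # Optimize the surrogate over the reduced box (stands in for the NLP/MINLP solve)
    x0 = X[np.argmin(y)]
    res = minimize(lambda x: float(surrogate.predict(x.reshape(1, -1))[0]),
                   x0, bounds=list(zip(local_lb, local_ub)))

    # Evaluate the true objective at the surrogate optimum and augment the training set
    X = np.vstack([X, res.x])
    y = np.append(y, objective(res.x))

print("best point found:", np.round(X[np.argmin(y)], 3))
print("best objective value:", round(float(y.min()), 6))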
References:
Bhosekar, A., & Ierapetritou, M. (2018). Space mapping based derivative-free optimization framework for supply chain optimization. Computer Aided Chemical Engineering, 44, 985-990.
Boukouvala, F., Misener, R., & Floudas, C. A. (2016). Global optimization advances in Mixed-Integer Nonlinear Programming, MINLP, and Constrained Derivative-Free Optimization, CDFO. European Journal of Operational Research, 252, 701-727.
Breiman, L. (2001). Random forests. Machine Learning, 45, 5-32.
Conn, A., Scheinberg, K., & Vicente, L. (2009). Introduction to Derivative-Free Optimization (Vol. 8). Philadelphia, PA, USA: SIAM.
Cozad, A., Sahinidis, N. V., & Miller, D. C. (2014). Learning surrogate models for simulation-based optimization. AIChE Journal, 60, 2211-2227.
Forrester, A., Sobester, A., & Keane, A. (2008). Engineering Design via Surrogate Modelling: A Practical Guide. Chichester: Wiley.
Friedman, J. H. (1991). Multivariate Adaptive Regression Splines - Rejoinder. Annals of Statistics, 19, 123-141.
Qian, H., Hu, Y., & Yu, Y. (2016). Derivative-free optimization of high-dimensional non-convex functions by sequential random embeddings. In G. Brewka (Ed.), Twenty-Fifth International Joint Conference on Artificial Intelligence (IJCAI '16) (pp. 1946-1952). New York, NY: AAAI Press.
Rios, L. M., & Sahinidis, N. V. (2013). Derivative-free optimization: a review of algorithms and comparison of software implementations. Journal of Global Optimization, 56, 1247-1293.
Smith, J. (2015). Computationally Assisted Biofuel Production: Hydrodynamics, Optimization, and Heuristics. University of Tulsa, Tulsa, OK.
Williams, B., & Cremaschi, S. (2021). Selection of surrogate modeling techniques for surface approximation and surrogate-based optimization. Chemical Engineering Research & Design, 170, 76-89.