(333d) An Optimization-Based Approach for Learning Simple Parametric Surrogate Models
In this paper, we present a systematic computational study of several fitness metrics that can be used in an optimization-based subset selection methodology to identify an optimal subset of regression variables. These metrics include Mallows' Cp, Akaike's information criterion, and the Bayesian information criterion, among others. The resulting models consist of a linear combination of nonlinear transformations of input variables, and their simple algebraic form can help provide insight into the system at hand. We complement these exact optimization algorithms with fast heuristics and describe their computational performance in ALAMO. Moreover, we present a systematic comparison of ALAMO's optimization-based approach to model fitting from data with a number of other parametric model-building methods, including the lasso implementation in Matlab and R's leaps routine.
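The core idea described above, selecting the best small subset of candidate basis functions by minimizing an information criterion, can be illustrated with a minimal sketch. The snippet below is not ALAMO's algorithm (ALAMO solves the subset selection problem with exact mixed-integer optimization); it is a brute-force enumeration over a hypothetical basis set, scored by the Bayesian information criterion, shown only to make the methodology concrete. All function and variable names are illustrative.

```python
import itertools
import numpy as np

def bic(y, y_hat, k):
    """Bayesian information criterion for a least-squares fit with k terms."""
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    return n * np.log(rss / n) + k * np.log(n)

def best_subset(X, y, max_terms=3):
    """Exhaustively enumerate subsets of candidate basis columns up to
    max_terms, fit each by least squares, and keep the lowest-BIC model."""
    n, p = X.shape
    best_score, best_cols, best_coef = np.inf, None, None
    for k in range(1, max_terms + 1):
        for cols in itertools.combinations(range(p), k):
            A = X[:, cols]
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            score = bic(y, A @ coef, k)
            if score < best_score:
                best_score, best_cols, best_coef = score, cols, coef
    return best_score, best_cols, best_coef

# Hypothetical candidate basis: nonlinear transformations of one input x.
rng = np.random.default_rng(0)
x = rng.uniform(1.0, 2.0, 50)
X = np.column_stack([x, x**2, np.log(x), np.sqrt(x), np.ones_like(x)])
# Synthetic response built from two of the basis terms, plus small noise.
y = 3.0 * x**2 + 1.5 * np.log(x) + 0.01 * rng.normal(size=50)

score, cols, coef = best_subset(X, y)
```

The selected model is a sparse linear combination of the candidate transformations, which is what gives the resulting surrogate its simple algebraic form. Exhaustive enumeration scales combinatorially in the number of candidate terms, which is why exact optimization formulations and fast heuristics, as studied in the paper, matter in practice.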
Cozad, A., N. V. Sahinidis, and D. C. Miller, Automatic learning of algebraic models for optimization, AIChE Journal, 60, 2211-2227, 2014.