(373aa) Comparison of Global, Stochastic Optimization Algorithms Using Toy Problems and Multi-Parameter Models of Kinetic, Fermentation, and Rheological Data
A classical approach to fitting parameters has been the minimization of the sum of squared residuals, also known as the weighted least-squares technique, using a local, gradient-based method. Several variations of this basic strategy are currently in use, including Newton's method, trust-region methods, line-search methods, and Levenberg-Marquardt methods. Alternatively, stochastic methods have been developed in order to better explore the available parameter space in search of a global minimum and/or to alleviate the dependence of the solution on the initial guess. These methods include simultaneous perturbation stochastic approximation (SPSA), simulated annealing, particle swarm optimization (PSO), and genetic algorithms (GA) [1,3].
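As a minimal sketch of the classical local approach, the following fits a hypothetical one-parameter exponential decay model y(t) = exp(-k t) by a scalar Levenberg-Marquardt iteration (damped Gauss-Newton steps on the sum of squared residuals). The model, data, and damping schedule are illustrative assumptions, not the models of the present work:

```python
import math

# Hypothetical one-parameter model for illustration: y(t) = exp(-k * t)
def residuals(k, t_data, y_data):
    return [math.exp(-k * t) - y for t, y in zip(t_data, y_data)]

def levenberg_marquardt_1d(k0, t_data, y_data, n_iter=50, lam=1e-3):
    """Minimal scalar Levenberg-Marquardt: damped Gauss-Newton steps on the SSE."""
    k = k0
    for _ in range(n_iter):
        r = residuals(k, t_data, y_data)
        J = [-t * math.exp(-k * t) for t in t_data]   # analytic dr/dk
        JtJ = sum(j * j for j in J)
        Jtr = sum(j * ri for j, ri in zip(J, r))
        k_new = k - Jtr / (JtJ + lam)                 # damped normal equation
        old_sse = sum(ri * ri for ri in r)
        new_sse = sum(ri * ri for ri in residuals(k_new, t_data, y_data))
        if new_sse < old_sse:
            k, lam = k_new, lam * 0.5                 # accept step, relax damping
        else:
            lam *= 10.0                               # reject step, damp harder
    return k

t_data = [0.1 * i for i in range(20)]
k_true = 1.5
y_data = [math.exp(-k_true * t) for t in t_data]      # noiseless synthetic data
k_fit = levenberg_marquardt_1d(0.2, t_data, y_data)
```

The adaptive damping factor `lam` interpolates between gradient descent (large `lam`) and Gauss-Newton (small `lam`); for a multimodal objective, however, convergence still depends entirely on the initial guess `k0`, which is the limitation the stochastic methods above address.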
The approach developed in the present work incorporates a hybrid parallel simulated annealing, Metropolis-like algorithm within a least-squares fitting algorithm recently developed by Armstrong et al. The parallel simulated annealing is used to generate a good initial guess for the parameter values, using as an objective function the sum of squared differences between the model predictions and oscillatory-in-time experimental data obtained over a period. The selection of the initial guess is followed by the application of a modified least-squares local minimization procedure to determine the "global" minimum accurately. An important advantage of the proposed method is that all parameters needed to execute it are evaluated numerically based on a few simulated annealing runs. A further advantage of the proposed algorithm is the potential speedup from executing its most time-consuming step, a series of simulated annealing simulations, in parallel where possible.
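The two-stage structure described above (a stochastic global search supplying the initial guess for a local least-squares refinement) can be sketched as follows. This is not the authors' implementation: the one-parameter model, cooling schedule, bounds, and the finite-difference Newton refinement are all illustrative assumptions:

```python
import math
import random

# Hypothetical one-parameter model for illustration: y(t) = exp(-k * t)
def sse(k, t_data, y_data):
    """Least-squares objective; plays the role of the pseudo-energy."""
    return sum((math.exp(-k * t) - y) ** 2 for t, y in zip(t_data, y_data))

def anneal_initial_guess(t_data, y_data, bounds=(0.0, 10.0), n_steps=2000, seed=0):
    """Stage 1: Metropolis simulated annealing to locate a good initial guess."""
    rng = random.Random(seed)
    k = rng.uniform(*bounds)
    e = sse(k, t_data, y_data)
    for step in range(n_steps):
        T = 1.0 - step / n_steps + 1e-3               # simple linear cooling schedule
        k_new = min(max(k + rng.gauss(0.0, 0.5), bounds[0]), bounds[1])
        e_new = sse(k_new, t_data, y_data)
        if e_new < e or rng.random() < math.exp(-(e_new - e) / T):
            k, e = k_new, e_new                       # Metropolis accept
    return k

def refine_locally(k, t_data, y_data, n_iter=30, h=1e-4):
    """Stage 2: local minimization (finite-difference Newton steps on the SSE)."""
    for _ in range(n_iter):
        g = (sse(k + h, t_data, y_data) - sse(k - h, t_data, y_data)) / (2 * h)
        H = (sse(k + h, t_data, y_data) - 2 * sse(k, t_data, y_data)
             + sse(k - h, t_data, y_data)) / (h * h)
        k = k - g / H if H > 0 else k - 0.1 * g       # fall back to a gradient step
    return k

t_data = [0.1 * i for i in range(20)]
k_true = 1.5
y_data = [math.exp(-k_true * t) for t in t_data]      # noiseless synthetic data
k_guess = anneal_initial_guess(t_data, y_data)
k_fit = refine_locally(k_guess, t_data, y_data)
```

The annealing stage only needs to land in the basin of the global minimum; the local stage then converges rapidly, which is why the combination is cheaper than running the stochastic search to full precision.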
The parallel simulated annealing algorithm proposed in the present work attempts to overcome the shortcomings of previous stochastic algorithms by exploring the parameter space more systematically, without the need for complicated adjustable parameters. The algorithm employs features of the more traditional simulated annealing "Monte Carlo" energy-minimization approach, with the pseudo-energy to be minimized derived from the least-squares error function constructed between the model predictions and the available data. In contrast to the standard simulated annealing process, the procedure proposed here involves a series of stochastic simulated annealing runs, each one built around its own Boltzmann energy level. However, the parallel simulated annealing sequences are not completely independent: from time to time, parameter information may be exchanged between "nearest neighbor" Boltzmann energy levels. When this exchange is conducted appropriately, it allows a maximally efficient search for the optimum parameter values over a wide parameter space, without requiring finely tuned parameters or a sensitive dependence on the initial guess [1, 4, 5, 6, 7].
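The replica-exchange scheme described above can be sketched on a toy double-well pseudo-energy, where a single cold chain would risk trapping in the local well. The energy function, temperature ladder, and exchange interval below are illustrative assumptions, not the authors' settings:

```python
import math
import random

def energy(x):
    """Toy double-well pseudo-energy: local minimum near x = -1, global near x = +1."""
    return (x * x - 1.0) ** 2 - 0.3 * x

def parallel_tempering(temps=(0.05, 0.2, 0.8, 3.0), n_sweeps=3000, seed=1):
    """One replica per temperature, with periodic nearest-neighbor swap attempts."""
    rng = random.Random(seed)
    xs = [rng.uniform(-2.0, 2.0) for _ in temps]
    es = [energy(x) for x in xs]
    best_x, best_e = xs[0], es[0]
    for sweep in range(n_sweeps):
        # Metropolis move within each replica at its own temperature
        for i, T in enumerate(temps):
            x_new = xs[i] + rng.gauss(0.0, math.sqrt(T))
            e_new = energy(x_new)
            if e_new < es[i] or rng.random() < math.exp(-(e_new - es[i]) / T):
                xs[i], es[i] = x_new, e_new
            if es[i] < best_e:
                best_x, best_e = xs[i], es[i]
        # Periodic information exchange between nearest-neighbor energy levels
        if sweep % 10 == 0:
            i = rng.randrange(len(temps) - 1)
            delta = (1.0 / temps[i] - 1.0 / temps[i + 1]) * (es[i] - es[i + 1])
            if delta >= 0 or rng.random() < math.exp(delta):
                xs[i], xs[i + 1] = xs[i + 1], xs[i]
                es[i], es[i + 1] = es[i + 1], es[i]
    return best_x, best_e

best_x, best_e = parallel_tempering()
```

The swap acceptance probability min(1, exp[(1/T_i - 1/T_j)(E_i - E_j)]) preserves each replica's Boltzmann distribution, so hot replicas cross barriers freely while cold replicas inherit promising configurations and refine them.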
[1] Armstrong et al. "An Adaptive Parallel Tempering Method for the Dynamic Data-Driven Parameter Estimation of Nonlinear Models." AIChE J. (2016).
[2] Battles and Trefethen. "An Extension of MATLAB to Continuous Functions and Operators." SIAM J. Sci. Comput. (2005).
[3] Armstrong. PhD Thesis, University of Delaware (2015).
[4] Spall, J.C. "Adaptive Stochastic Approximation by the Simultaneous Perturbation (SPSA) Method." IEEE Transactions on Automatic Control (2000).
[5] Spall, J.C. Introduction to Stochastic Search and Optimization. Hoboken, NJ: Wiley-Interscience (2003).
[6] Earl, D.J. and Deem, M.W. "Parallel Tempering: Theory, Applications, and New Perspectives." Phys. Chem. Chem. Phys. 7: 3910-3916 (2005).
[7] Amar, J.G. "The Monte Carlo Method in Science and Engineering." Computing in Science & Engineering, Mar/Apr 2006, pp. 9-19.