(624e) Accelerating Multiscale Global Optimization through Reduced Bayesian Optimization | AIChE


Authors 

Wichrowski, N. J. - Presenter, Johns Hopkins University
Psarellis, G., Johns Hopkins University
Bello-Rivas, J., Princeton University
Dietrich, F., Technical University of Munich
Hauenstein, J., University of Notre Dame
Kevrekidis, I. G., Princeton University
Optimization of multiscale potentials is pertinent to a variety of fields, including computational biology [1], fluid mechanics [2], and materials science [3]. Such multiscale objective functions change rapidly in some directions and slowly in others, so optimization algorithms typically exhibit behavior akin to singularly perturbed dynamical systems. The trajectory of iterates is quickly attracted onto a low-dimensional slow manifold, but further progress comes much more slowly.
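As a toy illustration of this two-timescale behavior (a hypothetical example, not taken from the work above), consider plain gradient descent on an objective with one stiff direction and one slow direction: the iterates are pulled onto the slow manifold y ≈ sin(x) within a handful of steps, after which progress toward the minimizer along the manifold is far slower.

```python
import numpy as np

# Hypothetical multiscale objective: f changes rapidly transverse to the
# curve y = sin(x) (scale 1/eps) and slowly along it (the 0.1*x**2 term).
def f(z, eps=1e-2):
    x, y = z
    return (y - np.sin(x))**2 / eps + 0.1 * x**2

def grad_f(z, eps=1e-2):
    x, y = z
    r = y - np.sin(x)
    return np.array([-2.0 * r * np.cos(x) / eps + 0.2 * x,
                      2.0 * r / eps])

# Gradient descent: the trajectory is quickly attracted onto the slow
# manifold y ~ sin(x), then creeps slowly toward the minimizer at x = 0.
z = np.array([2.0, 1.5])
step = 2e-3
traj = [z.copy()]
for _ in range(500):
    z = z - step * grad_f(z)
    traj.append(z.copy())
traj = np.array(traj)
```

After 500 steps the iterate sits essentially on the curve y = sin(x), yet x is still far from the minimizer at the origin, mirroring the singularly perturbed dynamics described above.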

We propose a method to accelerate the convergence of Bayesian optimization (BO) on these multiscale problems by exploiting, in a data-driven fashion, the existence of an underlying low-dimensional manifold. Using manifold learning techniques (e.g., diffusion maps), we construct an approximate, on-the-fly parameterization of the slow manifold from the location history of our BO iterations in the “full space” of the original domain. Then, on the “reduced space” of the slow manifold, we continue optimizing in terms of the relatively few diffusion map coordinates [4]. Previous work [5] used the reduced space to inform the choice of direction for a coarse step; here, we allow for global optimization in the reduced space. Although we must at times lift back to the full space to ensure that our approximation of the slow manifold remains valid, working preferentially in the reduced space limits the computational cost of locating a global minimizer.
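A minimal sketch of the reduced-coordinate construction, assuming a plain Gaussian-kernel diffusion map computed with NumPy (a simplified stand-in for the full data-driven machinery, not the authors' implementation): given the history of query points, the leading nontrivial eigenvector of the normalized kernel matrix supplies a coordinate along the low-dimensional manifold.

```python
import numpy as np

def diffusion_map_coords(X, n_coords=1, eps=None):
    """Leading nontrivial diffusion-map coordinates of the point cloud X."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :])**2, axis=-1)
    if eps is None:
        eps = np.median(sq_dists)        # common kernel-bandwidth heuristic
    K = np.exp(-sq_dists / eps)          # Gaussian kernel matrix
    # Symmetric normalization so a symmetric eigensolver applies; the
    # Markov-chain eigenvectors are recovered by rescaling with sqrt(d).
    d = K.sum(axis=1)
    A = K / np.sqrt(d[:, None] * d[None, :])
    vals, vecs = np.linalg.eigh(A)
    order = np.argsort(vals)[::-1]       # eigenvalues in descending order
    vecs = vecs[:, order] / np.sqrt(d)[:, None]
    # Skip the trivial constant eigenvector; scale by the eigenvalues.
    return vecs[:, 1:1 + n_coords] * vals[order][1:1 + n_coords]

# Illustration: noisy samples near a curve, i.e., a one-dimensional
# "slow manifold" embedded in 2D, standing in for a BO iterate history.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 3.0, 200))
X = np.column_stack([t, np.sin(t)]) + 0.01 * rng.normal(size=(200, 2))
phi = diffusion_map_coords(X, n_coords=1)
# phi[:, 0] varies monotonically along the curve, so it can serve as a
# single reduced coordinate in which to continue the optimization.
```

In the scheme described above, a surrogate is then fit and optimized in these few coordinates, with occasional lifting back to the full space to refresh the manifold parameterization.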

Furthermore, motivated by BO problems in which a black-box objective function includes simulators or solvers, we present a Bayesian Continuation (BCon) framework. Finally, we demonstrate how BCon can be coupled with (full- or reduced-space) Bayesian optimization to improve its speed and efficiency. Extensions of BCon to other problems are also briefly discussed.

[1] M. Alber, A. Buganza Tepole, W. R. Cannon, S. De, S. Dura-Bernal, K. Garikipati, G. Karniadakis, W. W. Lytton, P. Perdikaris, L. Petzold, and E. Kuhl. Integrating machine learning and multiscale modeling: Perspectives, challenges, and opportunities in the biological, biomedical, and behavioral sciences. NPJ Digital Medicine, 2(1):1–11, 2019.

[2] S. Sirisup, G. E. Karniadakis, D. Xiu, and I. G. Kevrekidis. Equation-free/Galerkin-free POD-assisted computation of incompressible flows. Journal of Computational Physics, 207(2):568–587, 2005.

[3] M. F. Horstemeyer. Multiscale modeling: A review. In J. Leszczynski and M. K. Shukla, editors, Practical Aspects of Computational Chemistry: Methods, Concepts, and Applications, chapter 4, pages 87–135. Springer, Dordrecht, Netherlands, 2009.

[4] F. Dietrich, J. M. Bello-Rivas, and I. G. Kevrekidis. On the Correspondence between Gaussian Processes and Geometric Harmonics. arXiv preprint arXiv:2110.02296, 2021.

[5] D. Pozharskiy, N. J. Wichrowski, A. B. Duncan, G. A. Pavliotis, and I. G. Kevrekidis. Manifold learning for accelerating coarse-grained optimization. Journal of Computational Dynamics, 7(2):511–536, 2020.