(621h) Computing Subgradients of Bivariate Functions for Nonsmooth Optimization

Authors 

Khan, K. - Presenter, McMaster University
Phenomena such as thermodynamic phase transitions, discrete mode switching, and closed-loop control actions may introduce nonsmoothness into otherwise smooth process models. Classical optimization methods developed for smooth models can fail entirely in the nonsmooth case, since there is no longer any guaranteed connection between gradients and descent directions. Instead, dedicated nonsmooth optimization methods rely on subgradients and generalized derivatives to obtain useful local sensitivity information.
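
To make this failure concrete, consider the textbook-style convex function f(x, y) = |x| + 2|y| (an illustrative example, not drawn from the presentation): at the nonsmooth point (1, 0), stepping along the negative of a valid subgradient strictly increases f, even though (1, 0) is not a minimizer. The short Python check below verifies both the subgradient inequality and this ascent behavior.

```python
import numpy as np

def f(z):
    # Convex but nonsmooth: f(x, y) = |x| + 2|y|.
    x, y = z
    return abs(x) + 2.0 * abs(y)

z0 = np.array([1.0, 0.0])   # nonsmooth point: the |y| kink is active here
s = np.array([1.0, 2.0])    # one subgradient; the subdifferential at z0 is {1} x [-2, 2]

# The subgradient inequality f(z) >= f(z0) + s . (z - z0) holds everywhere:
rng = np.random.default_rng(0)
for z in rng.uniform(-3.0, 3.0, size=(1000, 2)):
    assert f(z) >= f(z0) + s @ (z - z0) - 1e-12

# ...yet stepping along -s strictly increases f, so the negated
# subgradient is not a descent direction even though z0 is not optimal:
for t in (0.1, 0.05, 0.01):
    print(t, f(z0 - t * s) - f(z0))   # prints 3*t > 0 for each step size
```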

This presentation describes a new method for computing subgradients of functions of two variables under minimal assumptions, by making strategic use of directional derivatives, for use within overarching optimization methods. If the function is convex, then a subgradient is computed in the usual sense; if the function is nonconvex, then an element of Clarke’s generalized gradient [1] is computed instead. The two-variable requirement will be clarified through a counterexample; while this assumption is restrictive, the overall result is nevertheless useful and surprisingly versatile. Examples will be presented in which no other existing method for subgradient computation applies. The approach will also be applied to compute subgradients of Tsoukalas-Mitsos relaxations [2] of functions of more than two variables, for use in deterministic global optimization methods.
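
For intuition about why directional derivatives carry enough information, recall that for a convex function f the directional derivative is the support function of the subdifferential: f'(x; d) = max{s · d : s ∈ ∂f(x)}, so every subgradient s satisfies s · d ≤ f'(x; d) for every direction d. The sketch below illustrates this relationship numerically by sampling directions and solving a small feasibility LP; the direction grid, finite-difference step, and linprog call are illustrative choices only, and this is emphatically not the construction described in the presentation, which computes an exact subgradient.

```python
import numpy as np
from scipy.optimize import linprog

def directional_derivative(f, x, d, t=1e-7):
    # One-sided finite-difference estimate of f'(x; d).
    return (f(x + t * d) - f(x)) / t

def approx_subgradient(f, x, n_dirs=64):
    # Sample unit directions; each yields a supporting half-plane
    # s . d <= f'(x; d) that must contain every subgradient s.
    thetas = 2.0 * np.pi * np.arange(n_dirs) / n_dirs
    D = np.column_stack((np.cos(thetas), np.sin(thetas)))
    h = np.array([directional_derivative(f, x, d) for d in D])
    # Feasibility LP (zero objective): any point satisfying all the
    # half-plane constraints is an approximate subgradient.
    res = linprog(c=[0.0, 0.0], A_ub=D, b_ub=h,
                  bounds=[(None, None), (None, None)])
    return res.x

# Example: f(x, y) = max(x + y, 2x - y), nonsmooth along the line x = 2y.
f = lambda z: max(z[0] + z[1], 2.0 * z[0] - z[1])
s = approx_subgradient(f, np.array([2.0, 1.0]))
print(s)  # approximately a convex combination of (1, 1) and (2, -1)
```

With finitely many sampled directions this only brackets the subdifferential from outside, which is precisely the kind of gap an exact bivariate construction avoids.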

References

[1] F.H. Clarke, Optimization and Nonsmooth Analysis, SIAM, Philadelphia, PA, 1990.

[2] A. Tsoukalas and A. Mitsos, Multivariate McCormick relaxations, J. Glob. Optim., 59:633-662, 2014.