(575f) Optimization Algorithms for Dynamic Latent Variable Problems

Authors: 
Shin, S., University of Wisconsin-Madison
Smith, A., University of Wisconsin-Madison
Qin, J., University of Southern California
Zavala, V. M., University of Wisconsin-Madison

Analysis of multivariate time-series data is key for tasks such as process monitoring, fault detection, control, and data visualization and classification [1]-[6]. Dynamic latent variable (DLV) methods provide a powerful approach for the analysis of multivariate time-series data [3]. DLV methods assume the existence of latent variables that drive the dynamic behavior of the system, and they seek to extract such latent variables from the data by solving optimization problems. Several DLV analysis techniques have been reported in the literature, including dynamic inner principal component analysis (DiPCA) [4] and dynamic inner canonical correlation analysis (DiCCA) [5,6]. These DLV techniques rely on the solution of challenging nonconvex nonlinear programs (NLPs). Off-the-shelf NLP solvers such as Ipopt can solve such problems robustly but do not scale computationally [4]-[6], because DLV formulations give rise to dense linear algebra operations. Existing decomposition techniques for DLV problems, on the other hand, are highly scalable, but their convergence properties are not well understood [7].

In this talk, we present a rigorous study of the convergence properties of decomposition strategies for DiPCA and DiCCA. We first show that existing decomposition algorithms are coordinate maximization schemes [8]. This observation enables us to obtain insights into their convergence properties and to propose improved algorithmic variants. Our analysis also provides insight into how data structure affects the conditioning of the problem and the convergence properties of the algorithms. We present extensive benchmark tests with experimental chemical sensor data to validate these developments.
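To illustrate the coordinate-maximization pattern referred to above, the sketch below applies block-coordinate ascent to a generic bilinear objective J(u, v) = uᵀAv over unit-norm blocks u and v. This is an illustrative analogue, not the DiPCA/DiCCA algorithm itself: each block update is solved exactly (a normalized matrix-vector product), the objective is monotonically nondecreasing, and the fixed point is the leading singular pair of A. The function name and tolerances are illustrative choices.

```python
import numpy as np

def block_coordinate_ascent(A, iters=200, tol=1e-10, seed=0):
    """Alternately maximize J(u, v) = u.T @ A @ v subject to ||u|| = ||v|| = 1.

    Each step maximizes exactly over one block while holding the other fixed,
    which is the coordinate-maximization pattern analyzed in the talk
    (shown here on a simple bilinear objective, not the DiPCA/DiCCA problem).
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)
    J_prev = -np.inf
    for _ in range(iters):
        u = A @ v
        u /= np.linalg.norm(u)   # exact maximizer over u for fixed v
        v = A.T @ u
        v /= np.linalg.norm(v)   # exact maximizer over v for fixed u
        J = u @ A @ v
        if J - J_prev < tol:     # objective never decreases; stop at stagnation
            break
        J_prev = J
    return u, v, J

# The iteration converges to the leading singular pair of A,
# so J approaches the largest singular value (3.0 for this A).
A = np.array([[3.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
u, v, J = block_coordinate_ascent(A)
```

Note that each sweep is equivalent to a power-iteration step on AᵀA, which hints at why the conditioning of the data (the singular-value gap of A) governs the convergence rate of such schemes.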

References:
[1] Y. Cao, H. Yu, N. L. Abbott, and V. M. Zavala, “Machine learning algorithms for liquid crystal-based sensors,” ACS Sensors, vol. 3, no. 11, pp. 2237–2245, 2018.
[2] G. E. Box, G. M. Jenkins, G. C. Reinsel, and G. M. Ljung, Time Series Analysis: Forecasting and Control. John Wiley & Sons, 2015.
[3] G. Li, S. J. Qin, and D. Zhou, “A new method of dynamic latent-variable modeling for process monitoring,” IEEE Transactions on Industrial Electronics, vol. 61, no. 11, pp. 6438–6445, 2014.
[4] Y. Dong and S. J. Qin, “A novel dynamic PCA algorithm for dynamic data modeling and process monitoring,” Journal of Process Control, vol. 67, pp. 1–11, 2018.
[5] Y. Dong and S. J. Qin, “Dynamic latent variable analytics for process operations and control,” Computers & Chemical Engineering, vol. 114, pp. 69–80, 2018.
[6] Y. Dong and S. J. Qin, “Regression on dynamic PLS structures for supervised learning of dynamic data,” Journal of Process Control, vol. 68, pp. 64–72, 2018.
[7] S. Shin, A. D. Smith, S. J. Qin, and V. M. Zavala, “On the Convergence of the Dynamic Inner PCA Algorithm,” Under Review, 2019.
[8] Y. Wang, J. Yang, W. Yin, and Y. Zhang, “A new alternating minimization algorithm for total variation image reconstruction,” SIAM Journal on Imaging Sciences, vol. 1, no. 3, pp. 248–272, 2008.