(51e) Integrated Technical Computing for Process Engineering (poster)

Process engineering leverages data, domain knowledge, and modeling to design, scale, and continuously improve plants, from R&D to pilot to production. The workflow fundamentally involves data analysis and visualization, model development, and process optimization.

Data analysis starts with preprocessing of the dataset to impute missing values, correct errors, and, often, impart structure. Descriptive analysis is used to characterize what has been observed for each individual quantity, and to compute correlations between quantities. Visualization is used to rapidly develop an understanding of the relationships between predictors and responses.
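As an illustration, this stage of the workflow might look like the following minimal MATLAB sketch; the file name and the column names (Temp, Pressure, Yield) are hypothetical placeholders, not taken from the talk.

    % Load a hypothetical batch-record table and impute missing numeric values
    T = readtable('plant_data.csv');
    T = fillmissing(T, 'linear', 'DataVariables', @isnumeric);

    % Descriptive statistics for each quantity and pairwise correlations
    summary(T)
    R = corr(T{:, {'Temp','Pressure','Yield'}});

    % Visualize predictor-response relationships
    scatter(T.Temp, T.Yield, 'filled')
    xlabel('Temperature'); ylabel('Yield')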

Insights gained from data analysis are subsequently leveraged to create a model that describes the responses as mathematical functions of the predictors. In simple cases, first-principles understanding is confirmed and a governing equation can be written. In cases of intermediate complexity, the basic mechanisms behind the observed behavior are described by a theoretical framework, and parameter estimation or process identification methods can be used. Complex cases are typified by a multiplicity of superposed physical effects and the absence of an economical functional form; in these cases, machine learning techniques are required to infer the input-output relationships.
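For the intermediate-complexity case, a parameter estimation sketch might look like the following; the Arrhenius-type conversion model, the synthetic data, and the starting values are all assumptions made for illustration only.

    % Hypothetical rate model: conversion = 1 - exp(-k0*exp(-Ea/(R*T))*tau)
    % b(1) = k0, b(2) = Ea/R; X(:,1) = temperature [K], X(:,2) = residence time [min]
    modelfun = @(b, X) 1 - exp(-b(1).*exp(-b(2)./X(:,1)).*X(:,2));

    % Synthetic data for illustration only
    Xdemo = [430 10; 450 10; 470 15; 490 15; 430 20];
    ydemo = modelfun([5e6; 8000], Xdemo) + 0.01*randn(5,1);

    % Estimate the parameters by nonlinear least squares
    beta0 = [1e6; 7500];                      % rough initial guess
    beta  = nlinfit(Xdemo, ydemo, modelfun, beta0);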

Once the model has been created, predictions can be made about the responses that would be recorded if an experiment at the given predictor values were conducted. Since the model enables the parameter space to be explored continuously, optimization methods can be employed to find the combination of predictors that minimizes a specified function of the responses. The optimal conditions so determined inform the selection of the plant operating setpoint. Sensitivity studies are then conducted to define the plant operating region around the setpoint, and thus to specify the control system requirements.
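A minimal sketch of this step, continuing the hypothetical fitted model from the previous sketch (the parameter values, bounds, and starting point are assumptions):

    % Hypothetical fitted model from the previous sketch
    modelfun = @(b, X) 1 - exp(-b(1).*exp(-b(2)./X(:,1)).*X(:,2));
    beta = [5e6; 8000];                       % illustrative fitted parameters

    % Maximize predicted conversion over temperature [K] and residence time [min]
    obj = @(x) -modelfun(beta, x);            % minimize the negative response
    lb = [420 5];  ub = [500 25];  x0 = [450 10];
    xopt = fmincon(obj, x0, [], [], [], [], lb, ub);

    % Sensitivity of the response to temperature deviations around the setpoint
    dT = linspace(-10, 10, 21);
    yhat = arrayfun(@(d) modelfun(beta, [xopt(1)+d, xopt(2)]), dT);
    plot(dT, yhat), xlabel('Temperature deviation [K]'), ylabel('Predicted conversion')

For non-smooth or discontinuous objectives, patternsearch or ga from the Global Optimization Toolbox accept a similar calling pattern and can be substituted for fmincon.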

There are numerous software tools available that address selected subsets of the process engineering workflow. MATLAB is one of very few that cover it end-to-end, and it is distinguished in that it also offers utilities for distributed computing, big data processing, automatic code generation, and real-time interfacing with other software tools, including process simulators.

In this talk we describe MATLAB functionality relevant to each stage of the process engineering workflow, and provide application examples and graphics to bring the concepts to life. Specific topics will include data analysis (tall objects, logical indexing, statistical plotting), model development (principal components, system identification, neural networks), and optimization (linear programming, pattern search, and genetic algorithms).
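As one example of the data-analysis topics listed above, a tall-array sketch might look like the following; the log folder and variable names are hypothetical.

    % Build a tall table over a hypothetical folder of historian CSV exports
    ds = datastore('plant_logs/*.csv');
    tt = tall(ds);

    % Logical indexing selects high-temperature records without loading all data
    hot = tt(tt.Temp > 450, :);

    % Deferred evaluation: the computation runs only when gather is called
    avgYield = gather(mean(hot.Yield, 'omitnan'));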