# (356c) Population Balance Modeling for Twinscrew Granulation: Insight in the Notion of Distance between Distributions and the Need for Identifiable Models for Good Predictive Power

Conference: AIChE

## Authors

Ghent University
Ghent University
Ghent University

Traditional pharmaceutical solid oral dosage processes consist of a series of batch unit operations. More recently, a transition from batch processing to continuous manufacturing has been developing, to cope with the inefficiencies and high costs involved in process development and scale-up. Twin-screw wet granulation is an emerging continuous pharmaceutical process that is being assessed for its performance in solid dosage manufacturing. In this research, the twin-screw wet granulation unit is a unit operation of the ConsiGma™-25 continuous powder-to-tablet process line from GEA Pharma Systems. However, since these continuous processes are still under development in the pharmaceutical industry, detailed process knowledge and understanding is still evolving. Mechanistic models can help fill these knowledge gaps by thoroughly investigating detailed experimental data and unravelling the underlying mechanisms.

The authors' previous work focused on population balance modelling of the continuous pharmaceutical twin-screw wet granulation unit [1]. In that work, a novel two-compartmental population balance model was calibrated and validated using particle size measurements taken after the wetting zone and at the end of the granulator.

To further develop the practical applicability of this model, additional insight is needed. From an application perspective, the model parameters need to be linked to the process conditions. For this linkage to be sound, the estimated parameter set must be unique: every simulated distribution can be produced by only one parameter set. In mathematical terms, the mapping from parameters to distributions is injective. This is the basis of the identifiability concept, viewed here as a parameter estimation problem. First, the parameter set is estimated using simulated data (structural identifiability). Next, noise is added to find the maximum allowable experimental error under which the model can still be reliably calibrated (practical identifiability). The questions to be answered are: are the current kernel formulations sufficiently identifiable, and is the intermediate data after the wetting zone needed to perform a calibration?
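As a minimal sketch of this structural/practical identifiability loop, consider a toy model in which a single Gaussian stands in for a simulated size distribution; the `model` function, the "true" parameters and the noise levels below are hypothetical illustrations, not the population balance model from [1]:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Toy stand-in for the process model: a Gaussian whose parameters
# (mu, sigma) play the role of the kernel parameters to identify.
def model(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

x = np.linspace(0.0, 10.0, 100)
true = np.array([5.0, 1.2])
clean = model(x, *true)  # noise-free synthetic data (structural case)

recovered = []
for noise in (0.0, 0.01, 0.05):  # increasing synthetic measurement error
    y = clean + noise * rng.standard_normal(x.size)
    popt, _ = curve_fit(model, x, y, p0=[4.0, 1.0])
    # Practically identifiable at this noise level if the true
    # parameters are recovered within a chosen tolerance.
    recovered.append(bool(np.allclose(popt, true, atol=0.3)))
```

The noise level at which recovery first fails gives an estimate of the maximum allowable experimental error.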

In the parameter estimation problem, an objective function is used to quantify a "distance" between the measurement (in the calibration, this is measured particle size data; in the identifiability analysis, synthetic data) and the model simulation. In this application field (including our previous work), the standard approach is to use an L2 norm or a similar variant (sum of squared errors or root mean squared error). The question is whether this distance is an ideal objective function for comparing two distributions. For comparing distributions, a plethora of choices is available: comparing the d10, d50 and d90 values, comparing the moments of the distribution, etc.
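As an illustration of two of these options, the following NumPy sketch computes d10/d50/d90 quantiles by interpolating the cumulative distribution, alongside the conventional sum of squared errors; the lognormal-shaped density and the micron grid are made up for the example:

```python
import numpy as np

def quantiles_from_psd(sizes, density, probs=(0.10, 0.50, 0.90)):
    """Interpolate dX quantiles (d10, d50, d90) from a particle
    size density defined on a size grid."""
    widths = np.diff(sizes)
    mids = 0.5 * (density[:-1] + density[1:])
    # Cumulative distribution via trapezoidal accumulation.
    cdf = np.concatenate(([0.0], np.cumsum(mids * widths)))
    cdf /= cdf[-1]  # normalise to 1
    return np.interp(probs, cdf, sizes)

def sse(p, q):
    """Sum of squared errors: the conventional L2-type distance."""
    return float(np.sum((np.asarray(p) - np.asarray(q)) ** 2))

# Synthetic unimodal granule size density on a micrometre grid.
sizes = np.linspace(1.0, 2000.0, 500)
density = np.exp(-0.5 * ((np.log(sizes) - np.log(300.0)) / 0.6) ** 2)
d10, d50, d90 = quantiles_from_psd(sizes, density)
```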

A number of scenarios were worked out to assess the applicability of objective functions: Gaussians, Gaussian mixtures and distributions from simulated population balance models. These cover all the distributions encountered in this application field, unimodal as well as bimodal. The parameter space of these parametric distributions was sampled to visualise the objective functions. The quality of an objective function was assessed based on the ability to recover the original parameter set, the smoothness around the optimum, the occurrence of local minima, and the appearance of valleys or oscillations in the objective function surface.

Two key metrics were identified as performing optimally for the distributions encountered in this application field: the Kullback-Leibler divergence from information theory, also called the relative entropy, and the Earth Mover's distance from optimal transport theory. These metrics can now be applied in the parameter estimation problem of the identifiability analysis.
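Both metrics are readily computed for discrete distributions on a common grid; a sketch using SciPy's `entropy` (relative entropy) and `wasserstein_distance` (the 1-D Earth Mover's distance), with two made-up Gaussian-shaped test distributions:

```python
import numpy as np
from scipy.stats import entropy, wasserstein_distance

# Two normalised discrete distributions on a common size grid.
grid = np.linspace(0.0, 10.0, 200)
p = np.exp(-0.5 * ((grid - 4.0) / 1.0) ** 2)
q = np.exp(-0.5 * ((grid - 6.0) / 1.5) ** 2)
p /= p.sum()
q /= q.sum()

eps = 1e-12  # guard against log(0) in the relative entropy
kl = entropy(p + eps, q + eps)                # D_KL(p || q)
emd = wasserstein_distance(grid, grid, p, q)  # 1-D Earth Mover's distance
l2 = float(np.sqrt(np.sum((p - q) ** 2)))     # conventional L2 norm
```

Unlike the L2 norm, the Earth Mover's distance accounts for how far probability mass must be moved along the size axis, which is what makes it well behaved when two distributions barely overlap.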

The results of the identifiability analysis show that the model from previous work [1] is usable, albeit under certain constraints: some parameters need to be constrained, and others should be left out. The data in the wetting zone is crucial both for understanding the dynamics of the system and for calibrating the population balance model. This is an important result, as in recent years more and more papers have been published on compartmental population balance modelling of the twin-screw wet granulator without using intermediate data, i.e. data from inside the barrel, such as after the wetting zone or after the first kneading zone. This implies that the predictive power of such models is low and that they are therefore not usable in practical applications.

High predictive power is the most important goal of these modelling efforts. As such, the identifiability procedure described in this work should be performed before attempting a calibration. Without confirmation of a soundly identifiable model, there is no guarantee that the calibrated results can be used for further applications such as model predictive control or optimal experimental design.

[1] Van Hauwermeiren, D., Verstraeten, M., Doshi, P., am Ende, M. T., Turnbull, N., Lee, K., ... Nopens, I. (2018). On the modelling of granule size distributions in twin-screw wet granulation: Calibration of a novel compartmental population balance model. Powder Technology, 341, 116–125. https://doi.org/10.1016/j.powtec.2018.05.025