(658e) Methodology and Pitfalls When Calibrating a PBM: The Case of Twin-Screw Wet Granulation

Authors: 
Van Hauwermeiren, D., Ghent University
Kumar, A., Ghent University
Gernaey, K. V., Technical University of Denmark
De Beer, T., Ghent University
Nopens, I., Ghent University
Ghijs, M., Ghent University
Verstraeten, M., Ghent University
Doshi, P., Worldwide Research and Development, Pfizer Inc.
am Ende, M. T., Worldwide Research and Development, Pfizer Inc.
Lee, K., Pfizer Inc.
Turnbull, N., Pfizer Inc.

Traditional pharmaceutical processes comprise a series of batch-wise operations. A shift is now being made from these batch processes to continuous manufacturing to address the inefficiencies and high cost involved in process development. Twin-screw wet granulation is an emerging continuous granulation process that is being assessed for its performance in solid dosage manufacturing. However, since these continuous processes are fairly new in the pharmaceutical industry, detailed process knowledge and understanding are still lacking. Application of mechanistic models can help bridge this gap by assessing the experimental data and uncovering the underlying mechanisms. In this work, a Population Balance Model (PBM) is developed for predicting the granule size distribution inside the granulator, starting from the pre-blend up to the wet granules at the end of the process. In a PBM, physical processes such as aggregation and breakage of granules are represented by kernels, which are often empirical in nature and contain fitting parameters. The model equations are solved using the Cell Averaging Technique (CAT), which can handle both aggregation and breakage; different aggregation and breakage mechanisms can be implemented through different kernels. However, models are only useful when calibrated and validated against experimental data, and calibration of a PBM presents unique challenges in various intermediate steps.
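To make the aggregation and breakage terms concrete, the sketch below advances a discrete population balance one explicit Euler step on a linear volume grid. The constant aggregation kernel, the linear breakage rate, and the equal-halves daughter distribution are illustrative assumptions, not the calibrated kernels or the CAT discretization of this study; the linear grid is chosen so that aggregate volumes fall exactly on grid points, which conserves mass without the CAT redistribution step.

```python
import numpy as np

def pbm_step(n, dv, beta0, S0, dt):
    """One explicit Euler step of a discrete population balance on a
    linear volume grid v_i = (i + 1) * dv.

    Assumed (illustrative) kernels: a constant, size-independent
    aggregation kernel beta0 and a breakage rate S0 * v producing two
    equal-volume daughters. Not the kernels calibrated in the study.
    """
    N = len(n)
    dndt = np.zeros(N)
    # Aggregation (Smoluchowski form): on the linear grid,
    # v_i + v_j = v_k exactly when i + j = k - 1, so birth lands on-grid.
    for k in range(N):
        birth = 0.0
        for i in range(k):
            j = k - i - 1
            birth += 0.5 * beta0 * n[i] * n[j]
        death = beta0 * n[k] * n.sum()
        dndt[k] += birth - death
    # Binary breakage into two equal halves: only odd indices k have an
    # even volume multiple (k + 1) * dv, so the halves land on the grid.
    for k in range(1, N, 2):
        rate = S0 * (k + 1) * dv * n[k]
        dndt[k] -= rate
        dndt[(k - 1) // 2] += 2.0 * rate
    return n + dt * dndt
```

Because both mechanisms redistribute volume exactly onto grid points here, total mass is conserved to floating-point precision; on a geometric grid this is no longer automatic, which is precisely the problem the cell-averaging step of CAT addresses.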

First, the model grid needs to be defined (i.e. the number and location of size classes). Different measurement techniques (laser diffraction, QICPIC, sieve analysis) use different grids, and since each measurement technique has its own peculiarities, it is difficult to compare measurements performed with different techniques.

Second, an objective function needs to be defined. Current practice for calibrating a PBM mirrors the approach used for time series data and deterministic models: the Sum of Squared Errors (SSE) and the Root Mean Squared Error (RMSE) are the typical gold standards. Similarly, the information contained in a whole particle size distribution can be condensed into a few characteristic numbers, such as the mode, mean, span, and Sauter diameter. Proper evaluation is needed to confirm whether this indeed constitutes good modelling practice for PBM. Considerable freedom exists when dealing with particle size distributions; the question is how to handle this freedom and how best to couple it to the modelling objective.

Third, a technique to find the minimum of the objective function has to be selected. In this study, predictions of a selected PBM formulation are generated through a global parameter space exploration by means of a large set of Monte Carlo simulations. Changing the aggregation and breakage kernel parameters yields different size distributions. Different objective functions are evaluated to determine the one whose optimum yields the best agreement between the simulated and measured particle size distributions, bearing in mind the modelling objective. This optimal agreement, within a predefined error range, is called the calibrated model for that process setting. The study is repeated for different process settings of the twin-screw granulator. By comparing the calibrated results across process settings, the aggregation and breakage mechanisms can be identified and the most dominant regimes determined.
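The error measures and characteristic numbers above can be sketched briefly; the function names and the number-based density representation on class midpoints are illustrative assumptions, not the study's implementation.

```python
import numpy as np

def sse(sim, meas):
    """Sum of Squared Errors between simulated and measured densities."""
    return np.sum((sim - meas) ** 2)

def rmse(sim, meas):
    """Root Mean Squared Error between simulated and measured densities."""
    return np.sqrt(np.mean((sim - meas) ** 2))

def sauter_diameter(d, q):
    """Sauter mean diameter d32 = sum(q d^3) / sum(q d^2) for a
    number-based density q on size-class midpoints d."""
    return np.sum(q * d ** 3) / np.sum(q * d ** 2)

def span(d, q):
    """Span = (D90 - D10) / D50, with the percentiles read off the
    cumulative volume distribution by linear interpolation."""
    cum = np.cumsum(q * d ** 3)
    cum = cum / cum[-1]
    d10, d50, d90 = np.interp([0.1, 0.5, 0.9], cum, d)
    return (d90 - d10) / d50
```

Note that condensing a full distribution into d32 or the span discards shape information, which is exactly why the choice of objective function has to be weighed against the modelling objective.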
Finally, guidance will be provided on the different choices to be made during the calibration process.
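The global Monte Carlo exploration described above can be sketched as follows; the two-parameter toy model standing in for the PBM, the parameter bounds, and the sample count are all placeholder assumptions, with RMSE against a synthetic "measured" distribution as the objective function.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_psd(beta0, s0, d):
    """Toy stand-in for the PBM: a log-normal-like volume density whose
    mode shifts with the aggregation (beta0) and breakage (s0) kernel
    parameters. A placeholder, not the granulation model itself."""
    mode = 50.0 * (1.0 + beta0) / (1.0 + s0)
    q = np.exp(-0.5 * (np.log(d / mode) / 0.4) ** 2)
    return q / q.sum()

d = np.geomspace(10.0, 1000.0, 50)       # size grid in um (illustrative)
measured = simulate_psd(0.6, 0.2, d)     # synthetic "measurement"

# Global exploration: uniform Monte Carlo sampling of the two kernel
# parameters, keeping the sample with the smallest RMSE to the
# measured distribution.
samples = rng.uniform([0.0, 0.0], [2.0, 2.0], size=(2000, 2))
errors = [np.sqrt(np.mean((simulate_psd(b, s, d) - measured) ** 2))
          for b, s in samples]
best = samples[int(np.argmin(errors))]
```

In this toy model the objective depends only on the ratio (1 + beta0) / (1 + s0), so many parameter pairs fit equally well; comparing calibrated parameters across process settings, as done in the study, is one way to break such non-identifiability.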