The Fallacy of the Golden Batch

January 29, 2015

Over the years, batch automation has promoted the concept of a Golden Batch: a batch that progresses ideally and delivers both excellent yield and excellent final product quality.

Once such a dream batch has been made, replication of the “sweet spot” quickly becomes the focus. But replicating the golden batch’s outcome is much more complicated than simply following the same recipe for each batch.


    In this blog post, we highlight what we recommend in order to consistently produce high-quality batches.

    The Key to a Golden Batch Outcome: Adapt.

    There are many sources of variation in a batch process: raw material variations, different batch durations, shifting environmental conditions and fluctuating plant utilities. Simply automating the batch process to replicate every aspect of the golden batch recipe will not consistently produce new batches that perform like the golden batch, because other key sources of variation, such as changes in raw material quality, are not addressed.

    Hence, the best batch operating policy is not to try to replicate the “Golden Batch”, but rather to adapt to these changing conditions by using all available measurements throughout the batch to predict the final yield and quality and then to make small corrections to the trajectories at one or more decision points. This model-based control approach has led to significantly improved batch yields and final product quality.

    Multivariate Analysis can Effectively Define a Golden Region.

    In the world of analyzing large batch data sets, there have been tremendous advances using multivariate latent variable modeling methods (PCA & PLS). To handle the highly correlated trajectory variables and initial condition data, our team models the process in a reduced-dimensional latent variable space, where often only 2 to 4 latent variables are necessary.

    This small number of latent variables captures the important relationships in the data, so batch-to-batch variation can easily be identified from plots of the latent variable scores, which in turn provide the basis for statistical process control (SPC) and batch monitoring.
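    The dimension reduction described above can be sketched in a few lines. This is an illustrative example only, not ProSensus or ProMV code; the data, dimensions, and variable names are hypothetical. Each historical batch is unfolded into one row of trajectory measurements, the columns are centered and scaled, and a low-dimensional PCA model yields one score point per batch.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical history: 50 batches, each unfolded into one row of
# 10 trajectory variables x 20 time points = 200 columns.
X = rng.normal(size=(50, 200))

# Mean-center and scale each column, then fit a low-dimensional model;
# as noted above, 2 to 4 latent variables are often enough.
scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(X))

print(scores.shape)  # one 3-dimensional score point per batch
```

    Plotting the columns of `scores` against each other gives the latent variable score plots from which unusual batches stand out.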

    By defining a region in the latent variable space that includes scores of all the acceptable batches, any new batch is then monitored by continuously updating the estimated latent variable scores as real-time data is obtained during the batch. If the updated scores project into the acceptable region defined by control limits, then it can be assumed that the batch is progressing in a manner consistent with past good batches.
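    One common way to define such an acceptable region is a Hotelling's T² limit on the scores; this sketch uses that convention as an assumption (the actual scheme in ProMV may differ), with hypothetical score data.

```python
import numpy as np
from scipy.stats import f as f_dist

def hotelling_t2_limit(n_batches, n_components, alpha=0.95):
    """Approximate T^2 control limit for latent variable scores."""
    a, n = n_components, n_batches
    return (a * (n - 1) * (n + 1)) / (n * (n - a)) * f_dist.ppf(alpha, a, n - a)

# Hypothetical scores of 50 past good batches under a 3-component model.
rng = np.random.default_rng(1)
scores = rng.normal(size=(50, 3))
s2 = scores.var(axis=0, ddof=1)      # variance of each score column
limit = hotelling_t2_limit(50, 3)

# Current projected scores of a new, in-progress batch.
new_scores = np.array([0.5, -0.2, 1.1])
t2 = np.sum(new_scores**2 / s2)      # distance from the center of past good batches

print("in acceptable region:", t2 <= limit)
```

    As real-time data arrives, `new_scores` and `t2` are recomputed; as long as T² stays below the limit, the batch is progressing consistently with past good batches.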

    MVA software, such as Aspen’s ProMV (developed by ProSensus and sold to AspenTech in 2016), allows one to develop these models and set up the monitoring schemes. Once offline models have been developed and validated, they can then be implemented online to allow for real-time monitoring and prediction of final product quality and yield.

    These modeling and monitoring approaches are becoming widely used in the chemical, pharmaceutical, biologics and semiconductor industries.

    Real-time Batch Control Maximizes Yields & Product Quality while Reducing Costly Errors.

    Powerful new approaches to batch control have also been developed using latent variable models. By tracking the progress of every new batch, and making small corrections to the standard operation at a few critical decision points, you can maximize the projected yield and ensure the final product quality is well within acceptable bounds.

    Batch control models can be built in MVA software and embedded in process control systems for real-time implementation. These predictive batch control models are quite new and have only recently been deployed in the chemical, pharmaceutical and food industries.

    For example, in the production of snack foods, a batch control system was introduced that increased productivity by 25-30% and reduced deviations from target by 75% across all quality attributes.

    Multivariate Analysis of Historical Batches

    The first step in manufacturing consistently high quality product is to start with an offline multivariate analysis of historical batches.

    Provided there is suitable variability in the historical data, the resulting multivariate models can then be used to monitor, predict and control processes in real-time. Implementations of this approach in industrial applications have demonstrated that significant batch process improvements are possible.

    And here’s the best part: the much-sought-after golden batch does not need to be present in your historical dataset to get started. Multivariate analysis can improve your product quality and yields regardless. If you’d like to get started with an offline model to assess the feasibility of online batch control, contact us.

    About the Author: John MacGregor

    John MacGregor, Ph.D.
    Founder & Chairman
    John has dedicated the last 40 years to helping manufacturers improve and optimize their processes using multivariate data analysis. During his tenure at McMaster University, John authored over 200 peer-reviewed journal publications in the areas of mathematical modeling of processes, optimization and control, rapid product development and image analysis. John has received many awards for his pioneering work in developing and applying multivariate analysis to solve complex manufacturing problems, including the Shewhart Medal and W.G. Hunter Award from the American Society for Quality, the Herman Wold Medal from the Swedish Chemical Society, and the R.S. Jane Award and Century of Achievement Award from the Canadian Society for Chemical Engineering. In 2004, John founded ProSensus to help manufacturers learn more from their big data to increase yield, reduce operating costs, and improve product quality.