
FOOD PROCESSING OPTIMIZATION

Ensuring Safety While Advancing Quality and Productivity: Proper food safety is a top priority for any food manufacturer, and regulatory and customer demands continue to drive companies to fine-tune their processes. A key strategy for ensuring food safety involves implementing a kill step as a preventive control. Although it is critical to select the correct parameters to fully eradicate pathogens, it is also important to ensure the process does not yield other negative consequences. For example, when companies use conservative processing parameters as a margin of safety, building in extra time and higher temperatures to guarantee pathogen elimination, it can lead to overprocessing, which has implications for sensory quality, energy use, and throughput. In addition, unnecessary processing can lead to the overuse of chemicals and increase the risk of residues. It can also cause further safety problems, such as the formation of acrylamide in some product categories. Fundamentally, an unsuitable or overly cautious kill step has the potential to compromise product quality, safety, taste, and appearance, as well as processing time, costs, and productivity.
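To make the trade-off concrete: thermal kill steps are often evaluated with an accumulated lethality (F-value) integral, which converts a recorded time-temperature profile into equivalent minutes at a reference temperature. The short Python sketch below illustrates that calculation in a generic way; the reference temperature, z-value, and example profile are illustrative assumptions, not values taken from any specific process.

```python
import numpy as np

def lethality_minutes(times_min, temps_c, t_ref_c=121.1, z_c=10.0):
    """Accumulated lethality F = integral of 10**((T - Tref)/z) dt.

    times_min : sample times in minutes
    temps_c   : product temperature in deg C at each sample time
    t_ref_c   : reference temperature (121.1 C is a common sterilization reference)
    z_c       : z-value of the target organism (process specific, assumed here)
    """
    t = np.asarray(times_min, dtype=float)
    rate = 10.0 ** ((np.asarray(temps_c, dtype=float) - t_ref_c) / z_c)
    # Trapezoidal integration of the lethal rate over time
    return float(np.sum(0.5 * (rate[1:] + rate[:-1]) * np.diff(t)))

# Illustrative profile: heat-up, hold, cool-down (made-up values)
t = np.arange(0.0, 41.0)                       # minutes, 41 samples
T = np.concatenate([np.linspace(80, 121, 15),  # heat-up
                    np.full(15, 121.0),        # hold
                    np.linspace(121, 90, 11)]) # cool-down
print(f"Accumulated F-value: {lethality_minutes(t, T):.1f} min")
```

In practice the z-value, reference temperature, and required F-value are dictated by the target organism and the product's validated process schedule; the point of the calculation is to quantify how much margin a conservative schedule adds beyond the required lethality.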

Control loop optimization in food and beverage manufacturing: small tweaks, big impact.

Process optimization is key in food and beverage manufacturing, and control loops are its critical components. “Out-of-tune” loops can degrade product quality, increase material and energy consumption, and ultimately raise the risk of contamination. This article demonstrates how Industrial AI and Machine Learning can be used for PID loop tuning to improve and optimize control loops, generating big savings and reducing risk.

Process complexity is obviously a key factor. Heat-jacketed devices such as kettles, dryers, reactors, or pasteurization units can be hard to control. When steam is used, the heat transfer is not uniform, which can result in overshoot during heat-up and makes the control loops difficult to tune; this is less prevalent when water is used. Traditional cascaded loops will only solve part of the problem. An advanced analytics system such as Proficy CSense can help by looking at historical data to create a model of the actual profile and recommend new settings accordingly. The model takes into account changes in parameters such as viscosity and steam pressure, which affect the heat transfer coefficient and the flow pattern.
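As a rough illustration of what data-driven retuning involves, the sketch below fits a first-order-plus-dead-time (FOPDT) model to historical step-test data and derives PI settings from it with IMC-style tuning rules. It is a generic Python sketch under simplifying assumptions (a single loop and a clean step test), not a description of how Proficy CSense works internally; the example data and variable names are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def fopdt_step(t, K, tau, theta, u_step=1.0):
    """Response of a first-order-plus-dead-time process to a unit step at t = 0."""
    return np.where(t >= theta,
                    K * u_step * (1.0 - np.exp(-(t - theta) / tau)),
                    0.0)

# Hypothetical step-test data: jacket valve step -> product temperature rise
rng = np.random.default_rng(0)
t = np.linspace(0, 60, 121)                           # minutes
y_meas = fopdt_step(t, K=2.0, tau=8.0, theta=1.5) \
         + rng.normal(0.0, 0.02, t.size)              # simulated measurement noise

# Fit the FOPDT parameters (gain, time constant, dead time) to the data
(K, tau, theta), _ = curve_fit(fopdt_step, t, y_meas, p0=[1.0, 5.0, 1.0])

# IMC-style PI tuning (lam = desired closed-loop time constant, chosen >= theta)
lam = max(theta, 0.5 * tau)
Kc = tau / (K * (lam + theta))   # proportional gain
Ti = tau                         # integral time, minutes
print(f"K={K:.2f}, tau={tau:.1f} min, theta={theta:.1f} min -> Kc={Kc:.2f}, Ti={Ti:.1f} min")
```

In practice the identified model and the proposed settings would be validated against further runs before the controller is retuned.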

 

Analytics Software: How can process deviation be corrected more efficiently?

Proficy CSense is an all-in-one solution for determining and understanding the causes of process deviation in industrial environments. Its advanced analytics capabilities enable engineers and data scientists to analyze, monitor, predict, simulate, optimize, and control set points in real time.

Proficy CSense includes two sets of capabilities: one for process modeling and troubleshooting, and the second for online deployment and real-time monitoring.

Data is prepared and visualized, and rules-based or data-driven process models can be constructed. Using these models, the root causes of process deviations are identified so that processes can be optimized.
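The general workflow can be illustrated with ordinary open-source Python tooling (this is not the Proficy CSense API): assemble aligned process and quality data, fit a data-driven model, and inspect which inputs drive the deviation. The variable names and the synthesized dataset below are hypothetical.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Hypothetical, time-aligned per-batch data; in practice this would come from
# the plant historian and quality system rather than being synthesized.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "steam_pressure": rng.normal(6.0, 0.5, n),          # bar
    "jacket_temp": rng.normal(130.0, 3.0, n),           # deg C
    "agitator_speed": rng.normal(90.0, 10.0, n),        # rpm
    "raw_material_moisture": rng.normal(12.0, 1.0, n),  # %
})
# Synthetic quality deviation dominated by raw-material moisture and jacket temperature
df["final_moisture_deviation"] = (0.6 * (df["raw_material_moisture"] - 12.0)
                                  - 0.1 * (df["jacket_temp"] - 130.0)
                                  + rng.normal(0.0, 0.2, n))

inputs = ["steam_pressure", "jacket_temp", "agitator_speed", "raw_material_moisture"]
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(df[inputs], df["final_moisture_deviation"])

# Rank candidate root causes by how strongly each input drives the model
importance = pd.Series(model.feature_importances_, index=inputs).sort_values(ascending=False)
print(importance)
```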

 

Case study: How a major F&B manufacturer is using advanced analytics and data-driven insights to optimize performance

The initial goal of this project was simply to deliver savings in raw materials. With GE Digital's software, this customer connected their machines to collect all the data and modeled their processes to visualize what was happening. They performed in-depth analyses of their combined raw material, process, and product quality data to understand the correlations and root causes of issues. This revealed multiple insights into what was impacting the processes and how to improve them.

Using our Proficy CSense analytics solution, they stabilized the production lines and processes and found the “profit loops” by identifying which control loops were causing problems.

The result was an alignment of processes to where they needed to be, a 20% improvement in OEE, reduced product waste and raw materials costs, and higher quality products.

Using models to improve processes:
To optimize means to find the best solution, which in general implies finding the best compromise among several conflicting demands. A mathematical model is a representation of a real system, usually focused on a selected set of its properties and features. Models are the essential components of modern process systems engineering methods (i.e. simulation, optimization, and control), and they are usually classified into three categories:

First-principles (or white-box) models, which are derived from well-known physical and chemical relationships and reflect the underlying principles that govern the process behavior.

Data-driven (or black-box) models, which are empirical in nature (e.g. artificial neural networks, time series models).

Hybrid (gray-box) models, which combine the two.

First-principles models are usually composed of macroscopic and/or microscopic balances for energy, mass, and momentum, plus other relationships for kinetics, physical properties, and so on. Rigorously speaking, pure first-principles models are very rare, since some sort of empirical relationship (e.g. for physical properties) is almost always present. Although first-principles models are highly desirable for their favorable properties (e.g. for scaling or even extrapolation), the complexity of most processes, particularly in the food industry, often makes their derivation a very hard and resource-consuming task. Thus, gray- and black-box models can be a better alternative, and in fact they are more popular in real applications.
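As a small, simplified example of the gray-box idea, the sketch below combines a first-principles energy balance for a jacketed kettle with a heat transfer coefficient estimated empirically from (hypothetical) plant data. The model structure, constants, and measurements are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize_scalar

# First-principles part: lumped energy balance of a jacketed kettle
#   m*cp * dT/dt = U*A * (T_jacket - T), where U is the empirical (fitted) part
M_CP = 850.0e3    # product mass * specific heat, J/K (assumed)
AREA = 4.0        # heat transfer area, m^2 (assumed)
T_JACKET = 130.0  # jacket temperature, deg C (assumed constant)

def simulate(u_coeff, t_eval, t0=20.0):
    rhs = lambda t, T: [u_coeff * AREA * (T_JACKET - T[0]) / M_CP]
    sol = solve_ivp(rhs, (t_eval[0], t_eval[-1]), [t0], t_eval=t_eval)
    return sol.y[0]

# Data-driven part: estimate U from (hypothetical) measured temperatures
t_meas = np.linspace(0, 3600, 13)  # seconds
T_meas = np.array([20, 33, 44, 54, 63, 71, 78, 84, 89, 94, 98, 102, 105.0])

sse = lambda u: np.sum((simulate(u, t_meas) - T_meas) ** 2)
res = minimize_scalar(sse, bounds=(50.0, 2000.0), method="bounded")
print(f"Fitted overall heat transfer coefficient U = {res.x:.0f} W/m^2/K")
```

The balance equation supplies the structure, while the data supplies the coefficient that would otherwise be hard to derive from first principles alone.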


RELEVANT OPTIMIZATION PROBLEMS
Most models of food processing operations have an inherent dynamic nature, thus we use methods designed for the optimization of dynamic systems in order to arrive at the optimal decisions. There are three types of optimization problems which are especially relevant:

Computing optimal operating policies: that is, given a process dynamic model and a set of specifications, the objective is to compute the optimal operating conditions which lead to maximum performance as measured by some pre-defined criteria. These problems belong to the domain of dynamic optimization (or open-loop optimal control); a minimal sketch of this problem type appears after the three items.

Calibrating the models: i.e. the well-known problem of parameter estimation (inverse problem, model calibration), that is, finding the parameters of a nonlinear dynamic model which give the best fit to a set of experimental data. This is one of the mandatory steps in dynamic model development, but unfortunately many modelers are not aware of the dangers of applying standard optimization methods for its solution.

Integrated process design: to simultaneously find the static design variables (e.g. sizes and number of units), the operating conditions (e.g. flows, temperatures, pressures), and other design decisions (e.g. the controllers) which minimize capital and operating costs while optimizing certain characteristics of the dynamics of the process (e.g. maximizing controllability).
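As an example of the first problem type, the sketch below parameterizes the operating policy of an idealized pasteurization step with a single decision variable (the hold temperature) and searches for the value that reaches a target accumulated lethality in the least total time, a crude surrogate for "maximum performance". The kinetic constants, heat-up rate, and bounds are illustrative assumptions, not recommended values, and a full open-loop optimal control formulation would of course allow a richer temperature profile.

```python
import numpy as np
from scipy.optimize import minimize_scalar

Z_VALUE = 7.0     # z-value of the target organism, deg C (assumed)
T_REF = 72.0      # reference temperature, deg C (assumed)
F_TARGET = 0.5    # required lethality at T_REF, minutes (assumed)
HEAT_RATE = 2.0   # achievable heat-up rate, deg C per minute (assumed)
T_START = 20.0    # starting product temperature, deg C

def total_time(hold_temp):
    """Heat-up time plus the hold time still needed to reach the target lethality."""
    heatup = (hold_temp - T_START) / HEAT_RATE
    # Lethality already accumulated during the (linear) heat-up ramp
    t = np.linspace(0.0, heatup, 500)
    rate = 10.0 ** ((T_START + HEAT_RATE * t - T_REF) / Z_VALUE)
    f_ramp = np.sum(0.5 * (rate[1:] + rate[:-1]) * np.diff(t))
    # Remaining lethality delivered at the constant hold temperature
    rate_hold = 10.0 ** ((hold_temp - T_REF) / Z_VALUE)
    hold = max(F_TARGET - f_ramp, 0.0) / rate_hold
    return heatup + hold

# Open-loop "policy": a single decision variable, the hold temperature
res = minimize_scalar(total_time, bounds=(65.0, 95.0), method="bounded")
print(f"Best hold temperature ~{res.x:.1f} C, total time ~{total_time(res.x):.1f} min")
```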
