
Use of modelling and simulation in the pulp and paper industry

Pages 409-423 | Accepted 09 Sep 2009, Published online: 12 Nov 2009

Abstract

In the pulp and paper industry, the modelling and simulation of pulp production processes was the first major application. Several other applications have been realised since, though papermaking has never led in the use of computer-based modelling and simulation. The complex nature of the materials is one of the most demanding challenges and the biggest hurdle for any computational description of the papermaking process or of paper itself. Other industries therefore took the lead, and it is now up to the paper industry to learn from them what can be done with the help of computers to control or optimise processes and to design new grades.

The application of results gained with modelling and simulation techniques in pulp and paper manufacturing has helped the industry to, for example, reduce emissions and increase the productivity and cost-efficiency of its processes. Still, many important tasks remain open. A better understanding of the mechanisms of the processes and their control loops has to be achieved in order to further improve paper quality, stabilise the wet-end chemistry and enhance runnability. Important features to be developed are new process designs, efficient process monitoring and systems that offer decision support during operation.

This paper presents a review of the state of the art of modelling and simulation in the pulp and paper industry, together with further research needs.

1. Introduction

The paper industry today faces the challenge of being simultaneously flexible towards customers, efficient in the use of capital-intensive assets and ecologically sustainable, both in its use of resources and utilities and in its waste handling. However, production lines are not flexible enough to meet today's requirements because of complex dynamics and rigid operational practices. Applying modelling and simulation techniques to pulp and papermaking processes favours a better understanding of the mechanisms of the processes and their control loops. This will allow papermakers to find solutions for currently pending problems in the paper industry, such as fast grade changes, improvement of paper quality, optimisation of the wet-end chemistry, enhanced runnability and reduced emissions, through improved process design, process monitoring and decision support during operation.

Although the pulp and paper industry has used balancing calculations and process control for a very long time, the scope of modelling and simulation applications is not as complete as in many other modern process industries. This is largely because of the complicated nature of the process in terms of raw material characteristics, the difficulty of applying real-time control tools to processes that incorporate substantial time delays, and the high degree of interaction between the various production processes. An added difficulty is that some key parameters and variables of the raw materials and products cannot be quantified quickly and automatically. Furthermore, the final quality of paper depends on many process elements in a complicated and nonlinear way, and control actions are usually based on the skill and experience of the operators. However, the need to improve paper machine performance and increase the competitiveness of the mills makes operator decision-support tools based on dynamic optimisation necessary, allowing process operators and engineers to manage complex dynamic problems in an efficient and ecologically sustainable manner.

Some of these techniques have already been explored in the pulp and paper industry for data analysis, for improved process efficiency, and for the development of soft sensors and stochastic distribution control algorithms. However, the complexity of the papermaking process makes the results applicable only to individual cases; a universal solution to the problems addressed remains an open task.

As this is a key issue for the industry, the COST Action E36 ‘Modelling and Simulation in the Pulp and Paper Industry’ [Citation1] has been carried out in order to facilitate the coordination of activities and the exchange of knowledge at European level.

2. Modelling and simulation of pulp and paper production

Papermaking is a process in which the large amount of water and the low efficiency of most separation steps make many recycle streams necessary, for water, fibres and fillers, air and energy. Thus, even steady-state modelling requires considerable effort. The paper production process is highly dynamic, and although many papers have been published on dynamics, simulation and paper machines, only a few approaches to the dynamics of a paper machine at industrial scale have been published.

Heat balances as well as contaminant balances, for example for COD or macrostickies, have been developed during the last two decades. Industrial studies on other, more difficult problems, for example the simulation of the effects of various contaminants such as the deposition of pitch and stickies, have not been successful enough. Simulation of pulping processes is further advanced and can contribute experience in handling multicomponent balances that include complex chemical reactions. Improved steady-state modelling is still the key to a better design of pulping and paper production processes.

The wet end can be considered one of the most complex combinations of hydrodynamics and colloidal chemistry. No simulation model has yet been able to fully describe the processes taking place there. Some studies focus on the dynamics of white water [Citation2] and on the dynamics of the wet end of the paper machine [Citation3]. Process simulation is usually an optimisation tool, and only when the dynamics are well understood can a specific dynamic optimisation be addressed [Citation4].

2.1. Availability of data as an important success factor

In any modelling and simulation project, the availability of data is a crucial factor for the quality of the results. Nowadays it is common for paper mills (Figure 1) to have access to extensive amounts of process data. Large investments have been made to install and operate databases, data management software and process monitoring systems. However, the value currently derived from these systems tends to be low. This puts pressure on the promoters of the investment and on their common belief that this large amount of data will be of value in itself; in many cases this proves to be wrong.

Figure 1. Example of a paper mill.


Apart from a possible lack of expertise to analyse data properly, common problems are that certain parts of the process are sometimes not connected to the database system and that no sensors exist at the mill that can answer important specific technological questions. There are cases in which the old-fashioned sampling campaign is still best practice [Citation5], especially when dealing with complex problems at, for example, the wet end and when setting up mill balances for pulp, water, energy and detrimental substances.

2.2. Methods for the analysis of data

The possibilities of advanced data treatment for process control and optimisation are not always well known or exploited by papermakers. Classifying the data before analysis is of great help in obtaining an answer to a specific question. The final aim of the study determines which of the many existing analysis techniques can be applied; the methods are quite often complementary:

  1. Classic statistical analysis techniques, time-series analysis, experimental design, D-optimal design and trial-and-error methods are often used for the quick and adequate solution of specific short-term and usually isolated problems [Citation6].

  2. Physico-chemical deterministic models are the optimum approach when a stable, adequate response over a broad range is needed, for example when modelling the overall behaviour of the process or one of its sections. Typical applications of this sort of model are the design of new equipment or the study of the kinetics of flocculation [Citation7–9].

  3. The third approach is a combination of approaches 1 and 2. Sometimes the behaviour of the process, section or equipment to be modelled (the physics and/or chemistry of the phenomena involved) is complex and unknown, but a general model has to be built to predict process evolution, optimise process variables or simulate scenarios. In this case, advanced data analysis tools, for example multivariate analysis (MVA) techniques and artificial neural networks, very often provide models with better performance than the techniques of approach 1 [Citation10–13].
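As a toy illustration of the deterministic-model route in item 2, the flocculation kinetics mentioned there can be sketched with Smoluchowski's perikinetic aggregation law, dN/dt = −k·N², integrated by explicit Euler. The rate constant and initial particle count below are hypothetical illustration values, not mill data:

```python
# Toy deterministic model: Smoluchowski perikinetic flocculation,
# dN/dt = -k * N**2, where N is the particle number concentration.
# k and n0 are hypothetical illustration values.

def simulate_flocculation(n0, k, dt, steps):
    """Integrate dN/dt = -k*N^2 with explicit Euler; returns N over time."""
    n = n0
    history = [n]
    for _ in range(steps):
        n = n - k * n * n * dt
        history.append(n)
    return history

traj = simulate_flocculation(n0=1e6, k=1e-9, dt=1.0, steps=100)
# Closed-form solution for comparison: N(t) = N0 / (1 + k*N0*t)
analytic = 1e6 / (1.0 + 1e-9 * 1e6 * 100)
print(traj[-1], analytic)
```

For these step sizes the Euler trajectory tracks the closed-form solution closely; a stiffer system or larger k·N₀·dt would call for an implicit scheme.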

MVA techniques are quite useful when a problem with a high number of interrelated variables has to be analysed [Citation14–15]. They allow a huge amount of data to be handled to obtain results that can help researchers to improve the process. They can be classified into several groups:

  • Cluster analysis, where each cluster represents similar data; useful for classification.

  • Dimension reduction techniques, for instance maximum likelihood common factor analysis and principal component analysis (PCA).

  • Other multivariate methods: multivariate analysis of variance, canonical correlation analysis, partial least squares (PLS), etc.
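As a minimal sketch of the dimension-reduction group above, PCA can be computed directly from the SVD of centred data. The data here are synthetic (two hidden factors driving five correlated "sensor" variables), purely for illustration:

```python
import numpy as np

# Minimal PCA sketch on synthetic "process data": 200 observations of
# 5 correlated sensor variables driven by 2 hidden factors. All values
# are synthetic illustration data, not mill measurements.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))             # two hidden factors
mixing = rng.normal(size=(2, 5))               # factor-to-sensor map
data = latent @ mixing + 0.05 * rng.normal(size=(200, 5))

# Centre the data; the principal components are the rows of vt.
centred = data - data.mean(axis=0)
u, s, vt = np.linalg.svd(centred, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)            # variance explained per PC

scores = centred @ vt[:2].T                    # projection onto 2 PCs
print("variance explained by first 2 PCs:", round(float(explained[:2].sum()), 4))
```

Because only two factors generate the data, the first two components capture almost all the variance; on real mill data the explained-variance profile tells how many components are worth keeping.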

There are works that combine different advanced modelling techniques. Miyanishi and Shimada [Citation11] used MVA techniques together with artificial neural networks (ANNs). These tools can be used for advanced control in paper and board making, and they have been described by Wang et al. [Citation16].

ANNs have been defined as ‘data processing systems consisting of a large number of simple, highly interconnected processing elements (artificial neurons) in an architecture inspired by the structure of the cerebral cortex of the brain’ [Citation17].

ANNs are one of the tools grouped under 'artificial intelligence', a field in which mathematical modelling tools based on biological structures and behaviours are used for data screening, prediction of process variables and product properties, process modelling and so on.

Some examples can be found in the literature. For instance, both coated board brightness and the use of additives have been optimised using genetic algorithms, ANNs and experimental design [Citation18].

Feed-forward ANNs with backpropagation algorithms have been the most widely used in the pulp and paper industry [Citation10–11,Citation18–22]. Monitoring and prediction of emissions [Citation10,Citation23], product quality prediction [Citation18,Citation20,Citation24] and process optimisation [Citation11,Citation21,Citation22,Citation25] have been among the purposes of such ANNs. In all those cases, the results obtained with ANNs were considered successful.
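A feed-forward ANN with backpropagation, of the kind cited above, can be sketched in a few lines. The network size, learning rate and synthetic target below are hypothetical illustration values; the cited mill applications used far richer data:

```python
import numpy as np

# Illustrative feed-forward ANN trained with backpropagation (one hidden
# tanh layer), fitted to a synthetic nonlinear "quality" target.
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=(256, 2))          # two "process variables"
y = np.sin(1.5 * x[:, :1]) - 0.5 * x[:, 1:2]   # synthetic target

w1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
w2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ w1 + b1)
    return h, h @ w2 + b2

_, p0 = forward(x)
mse_before = float(np.mean((p0 - y) ** 2))

lr = 0.2
for _ in range(5000):
    h, pred = forward(x)
    err = 2.0 * (pred - y) / len(x)            # gradient of the MSE
    grad_w2 = h.T @ err                        # output-layer gradient
    dh = (err @ w2.T) * (1.0 - h ** 2)         # backprop through tanh
    w2 -= lr * grad_w2; b2 -= lr * err.sum(axis=0)
    w1 -= lr * x.T @ dh; b1 -= lr * dh.sum(axis=0)

_, p1 = forward(x)
mse_after = float(np.mean((p1 - y) ** 2))
print("MSE before/after training:", round(mse_before, 4), round(mse_after, 4))
```

Full-batch gradient descent is used here for brevity; practical applications would add a validation set to guard against the overfitting and drift issues discussed later in this paper.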

2.3. Modelling and simulation software

Nowadays, a large number of software packages exist that can perform modelling and simulation for the pulp and paper industry. A recent COST report gives an overview of the current use of software by COST E36 members [Citation26]. A list of the simulation software packages used is shown in Table 1.

Table 1. List of simulation software packages used for the pulp and paper industry [Citation26]

They can be subdivided into three different groups, depending on their simulation fields, capabilities and internal functions:

  • System or process level modelling: object-oriented software is used for flow-sheet-based mass balance calculations. Balances and functionalities are often based on first principles and include water, pulp, energy, etc. This is the most commonly used approach to process description and simulation; the approach to process optimisation is still empirical in most cases. A great variety of software packages is available; Klemona and Turunen have published a complete report on Finnish modelling and simulation software [Citation27]. The unit operations are defined as objects in libraries that can be positioned on a worksheet. The relationships between the objects are defined by lines drawn between them, and the functionality is defined within dialogues of the individual objects. The equation system built in the background is typically solved with a sequential or a matrix solver [Citation15,Citation28].

  • Unit operation level: high-fidelity models of single pieces of equipment or smaller functional groups. Typical examples are paper machine dryer models. These pieces of software are typically written in plain code such as FORTRAN or C++ and originate either from academia or from equipment suppliers. As a new trend, suppliers try to incorporate them into the flow-sheet-based software. A breakthrough was achieved by Andritz, which links its sub-models as DLL files to the process model, thus providing the full functionality while safeguarding the undisclosed code [Citation29]. In line with this, big equipment suppliers like Andritz and Metso have bought companies that develop flow-sheet-oriented software; only a few software suppliers remain independently active on the market.

  • Software for data analysis and data-based modelling and simulation: this type of software is typically applied in cases where no physical models are available or even the question of cause and effect is unresolved. Data analysis followed by modelling can be performed as described in this section. Typical examples are wet-end chemistry modelling studies. A typical example of this type of software is Matlab (MathWorks) [Citation15,Citation30], which allows the creation of user functions with custom options.
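The sequential solver mentioned for system-level tools can be sketched with a toy recycle balance: a fresh fibre feed mixes with recycled white water, the wire retains part of the fibre, and a fraction of the white water is returned to the loop. All flows and fractions are hypothetical illustration values:

```python
# Sequential-modular sketch of a flow-sheet mass balance with a recycle
# stream, solved by fixed-point ("tear stream") iteration. Flows in kg/s
# and split fractions are hypothetical illustration values.

FRESH = 10.0        # fresh fibre feed
RETENTION = 0.8     # fraction of headbox fibre retained in the sheet
RECYCLE = 0.9       # fraction of white water returned to the loop

def solve_recycle(tol=1e-10, max_iter=1000):
    """Iterate over the torn recycle stream until it stops changing."""
    recycle_flow = 0.0
    for _ in range(max_iter):
        headbox = FRESH + recycle_flow              # mixing chest
        white_water = (1.0 - RETENTION) * headbox   # not retained on wire
        new_recycle = RECYCLE * white_water         # returned to loop
        if abs(new_recycle - recycle_flow) < tol:
            break
        recycle_flow = new_recycle
    return headbox

headbox = solve_recycle()
analytic = FRESH / (1.0 - RECYCLE * (1.0 - RETENTION))
print(headbox, analytic)   # both ~12.195 kg/s
```

For a single linear loop the closed-form answer is immediate; the point of the sequential-modular iteration is that the same loop structure still converges when each block is replaced by an arbitrary nonlinear unit-operation model.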

2.4. Off-line use of simulation and simulation-based optimisation in the pulp and paper production processes

Simulation is an integral part of process and control design in the process industry, including the pulp and paper industry. The level of detail varies from computing small-scale chemical reactions to mill-wide process calculations. In many cases, some or all aspects of the time-dependent behaviour of the real process need to be reproduced. Figure 2 provides a rough classification of the most common modelling and simulation tools with respect to dynamic abilities and level of detail.

Figure 2. Classification of off-line modelling and simulation tools.


Design problems constitute the majority of applications of off-line simulation in pulp and paper production. These range from defining elementary mass and energy balances for a process concept with steady-state simulation to using dynamic simulation for the integrated design of process and control [Citation31].

Unit operation development is often supplemented by model development. Here, models are not necessarily used for obtaining quantitative results, but rather for increasing knowledge of the phenomena underlying the unit operation. Computational fluid dynamics (CFD) is often used for these calculations, for example to optimise the internal geometry of process equipment. Examples include the use of CFD for headbox design [Citation32] and wet-press models [Citation33,Citation34] for the press section.

The second application is the use of modelling and simulation for off-line process analysis. Dynamic simulation is used for process analysis and troubleshooting, to identify the sources of disturbances and their propagation in the process. Dynamic capabilities, as well as realistic control system models, are essential in order to capture the time delays and to analyse the ability of the process and its control to dampen the disturbances. Figure 3 illustrates potential sources of disturbances and their typical time scales at a paper mill [Citation35].

Figure 3. Typical time scales of process disturbances. Adapted from Cutshall [Citation35].


Applications include process operation improvement, such as the optimisation of grade changes [Citation36]. As design and operational problems are often multi-objective by nature, multi-objective optimisation techniques have lately been applied to pulp and paper problems [Citation37]. These techniques allow several criteria to be optimised simultaneously and balanced solutions to be found without having to constrain some of the criteria a priori. One of the most important design and operation criteria is product quality, and models tying the process conditions to the end-product properties are still very much under development, as mentioned earlier. Statistical models have been built to estimate the effect of process conditions on product properties, but by their nature they are suited to the optimisation of an existing pulp or paper production line where enough data can be collected for the models [Citation13,Citation38].
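A simple way to trace the trade-off surface in a multi-objective problem is the weighted-sum method: scan the weight between two criteria and solve each scalarised problem. The two quadratic objectives below are hypothetical stand-ins (for, say, a cost proxy and a quality-loss proxy), chosen so the scalarised optimum has a closed form:

```python
# Weighted-sum sketch of multi-objective optimisation: minimise
# w*f1 + (1-w)*f2 for a scan of weights w. Objectives are hypothetical.

def f1(x):  # e.g. operating-cost proxy, minimised at x = 1
    return (x - 1.0) ** 2

def f2(x):  # e.g. quality-loss proxy, minimised at x = -1
    return (x + 1.0) ** 2

def pareto_points(n=5):
    """One trade-off point per weight w in (0, 1)."""
    points = []
    for i in range(1, n + 1):
        w = i / (n + 1)
        # d/dx [w*(x-1)^2 + (1-w)*(x+1)^2] = 0  =>  x = 2w - 1
        x = 2.0 * w - 1.0
        points.append((x, f1(x), f2(x)))
    return points

for x, a, b in pareto_points():
    print(f"x={x:+.2f}  f1={a:.2f}  f2={b:.2f}")
```

Each weight yields one point on the trade-off curve: as the weight on f1 grows, f1 falls and f2 rises, which is exactly the "balanced solutions without a priori constraints" idea in the text (real applications replace the closed-form minimiser with a numerical optimiser).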

One of the complicating factors in defining mechanistic quality models is the complex chemistry of pulp and papermaking. In addition to the papers already referenced, further off-line applications of simulation models include maximising total profitability through dynamic production planning, for example by combining production planning and scheduling tools with cost management [Citation39].

It has to be noted that the above-mentioned examples have been solved using different types of models and tools; so far, no single general tool or set of models exists that can be used for every type of problem. Instead of attempting to build a general tool, which would be non-optimal for most applications, information and model transfer between the tools should be improved to avoid tedious re-defining of data.

Often process models are built for a specific purpose only, for example a process retrofit, and are not used afterwards. The biggest advantage of models would be obtained if they were utilised throughout the whole life cycle of the plant. The plant model would then be built in the design phase of the project and subsequently used to:

  • Check the dimensioning of equipment and the feasibility of control loops.

  • Train operators on a system that is based on the simulation model.

  • Verify the distributed control system (DCS) logic: DCS checkout can be performed with the model connected to the DCS instead of the real process.

The benefit of this is a rather steep start-up curve, providing the payback of an investment of 0.5–3 M$ [Citation40]. Further use could be made of the model to check online sensors and to identify deviations from the set point of individual equipment or of the process as a whole. This would require the model to run in real time in parallel with the process itself. Such a model, covering the whole life cycle of the plant, requires a well-defined and uniform data structure behind it to incorporate both design and operational data and to communicate with process design tools, automation systems, data repositories, etc.

2.5. Online use of simulation and simulation-based optimisation in the pulp and paper production processes

Because of the complexity of the pulping process and the dynamics of the paper production process, process control systems are widely established in the pulp and paper industry. In a typical application, the number of I/O connections can vary between 30,000 and more than 100,000. In most cases conventional control technology is used: operators see the actual values on process displays, and proportional–integral–derivative (PID) controllers help to operate the plant.
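The conventional PID loop mentioned above can be sketched on a first-order plant discretised by Euler. Process gain, time constant and tuning are hypothetical illustration values:

```python
# Minimal discrete PID loop on a first-order process,
# tau * dy/dt = -y + gain * u, as a sketch of conventional control.
# All plant and tuning parameters are hypothetical illustration values.

def simulate_pid(setpoint=1.0, kp=2.0, ki=1.0, kd=0.1,
                 dt=0.1, steps=300, tau=2.0, gain=1.0):
    """Return the closed-loop output trace over `steps` samples."""
    y, integral, prev_err = 0.0, 0.0, setpoint
    trace = []
    for _ in range(steps):
        err = setpoint - y
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv    # PID law
        prev_err = err
        y += dt * (-y + gain * u) / tau              # plant update (Euler)
        trace.append(y)
    return trace

trace = simulate_pid()
print("final value:", round(trace[-1], 4))           # settles near the setpoint
```

The integral term removes the steady-state offset a pure proportional controller would leave; evaluating many such loops against their achievable performance is the starting point for the advanced-control assessment discussed below.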

Various approaches can be followed to manage these complex systems in an optimal way. The first issue to be addressed is how to handle the huge amount of data available within the system. Fast data acquisition, high dimensional data analysis and dimensional reduction play a major role in providing the right data set. In addition new sensors that have become available within the past few years have to be evaluated as additional sources of information.

Based on the available and digested information, two approaches can be followed, depending on the issue addressed. The first is open-loop decision making supported by simulation. Nonlinear system modelling combined with multivariable system optimisation is one of the basic principles used here. Models are calibrated against real data scenarios, and the results are made available to the operators and engineers, who then decide about the next steps to be followed, as shown in Figure 4.

Figure 4. Modelling and simulation as decision-making support tools.


The second approach is closed-loop control. The performance of all fast control functions – currently carried out with simple PID controllers – is to be evaluated. It can then be decided whether there is potential for improvement with advanced control techniques, such as multivariable process control.

Up to now, few online simulation applications have been realised in industry. Most of the software tools have been developed in other process industry sectors; the petrochemical industry is the leader in applying simulation online. Applications in the paper industry include the multi-input multi-output model predictive control (MIMO-MPC) of paper machine quality (basis weight, filler, moisture, etc.) in both the machine and cross direction, which is now widely implemented in commercial systems (Metso, ABB and Honeywell). The optimisation of basis weight, coating weight and moisture in the cross direction of the sheet was actually one of the first (if not the first) high-dimensional MIMO-MPC applications, even though the model of the dynamics was rather simple.

These applications serve as a good starting point for optimisation based on advanced simulation. Such ideas have recently been explored in an EU-funded research project [Citation41]. The main goal is to identify new applications and to use all available know-how on the dynamic description of papermaking processes to adapt solutions from other fields to the paper production process.

Some examples described in the references are as follows:

  • Wet-end stability: multivariable predictive control is used to model the interactions in the wet end and in the dry section in order to reduce sheet quality variations by more than 50% and to keep quality on target during long breaks, during production changes and after broke flow changes. This is achieved by coordinating thick stock, filler and retention aid flows to maintain a uniform basis weight, ash content and white water consistency [Citation42].

  • At an integrated mill in Sweden, multivariate analysis (PLS) was used to build a model predicting 15 different paper quality properties. Initially 254 sensors were used, but 80% of them turned out to be unreliable due to poor calibration, so only 50 were kept. Further analysis identified 12 variables as the most important, and 5 of these were varied in a systematic way to form good prediction models for the 15 quality variables. The sensors for the 12 most important variables were calibrated frequently during the experiments. The models were made by measuring online data for a certain volume element of fibres flowing through the plant and adding off-line quality data for the final product. The model was then used online as a predictor to optimise production. The problem was that the models started to drift after some weeks and degraded, a common problem with statistical black-box models, at least when many variables are involved [Citation43].

  • An alternative is then a grey-box model approach, where model building starts with a physical model that is tuned with process data. A model of this type was shown to give very reliable values for the ring crush test on linerboard for several years without any need for recalibration, and was thus used as part of a closed-loop control at several mills [Citation44].

  • A physical deterministic digester model was built, including the chemical reactions in a continuous digester as functions of temperature, chemical concentration and residence time. Circulation loops and extraction lines were included as well. This was used as part of an open-loop MPC, and the set points were implemented manually by the operators. The optimised production increased earnings by US$800,000/year compared with using the 'normal production recipe' for the same wood quality and production rate [Citation45].

  • Web-break diagnosis was carried out using feed-forward ANNs and PCA. Variable selection and further modelling resulted in several improvements: first-pass retention and first-pass ash retention improved by 3% and 4%, respectively; sludge generation from the effluent clarifier was reduced by 1%; alum usage decreased; and the total annual cost reduction was estimated at around €1 million [Citation11].

  • At a Spanish paper mill, multiple regression techniques and ANNs are being used to predict paper quality. From an initial set of more than 7500 process parameters, only 50 were pre-selected for modelling. Statistical analysis allowed the number of model inputs to be reduced to fewer than 10. Predictions for different quality parameters have been very accurate (e.g. R2 over 0.74 for paper formation predictions). The next step is to optimise model usefulness and robustness through appropriate validation procedures and a further reduction of the number of inputs [Citation13].

In dynamic optimisation carried out in parallel with the process, the quality of online data and the dynamic validity of the models are of utmost importance. Data quality is severely compromised by slow changes in the characteristic curve relating the signal (in 4–20 mA) to the physical quantity measured, for example consistency. The common practice for maintaining measurement quality is that samples of the processed intermediate or final product are occasionally taken to the laboratory to be measured; when the online measurement deviates considerably from the laboratory value, the characteristic curve is updated for better correspondence. Unfortunately, the updates are made in a rather haphazard manner. Recently, methods to systematise the updating [Citation46] or to detect the need for a proper re-determination of the characteristic curve [Citation47] have been presented.
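A systematic version of this characteristic-curve update can be sketched as a least-squares refit of the mA-to-consistency map against accumulated laboratory samples. The drifted "true" relation and all numbers are hypothetical illustration values:

```python
import numpy as np

# Sketch of systematically updating a sensor characteristic curve: a
# linear map from the 4-20 mA signal to consistency (%), refitted by
# least squares against accumulated laboratory reference samples.
# All numbers are hypothetical illustration values.

rng = np.random.default_rng(2)

def true_consistency(ma):
    """Hypothetical drifted real relation that the laboratory measures."""
    return 0.35 * (ma - 4.0) + 0.2

# Accumulated lab samples: (signal in mA, lab-measured consistency in %)
signal_ma = rng.uniform(4.0, 20.0, size=30)
lab_value = true_consistency(signal_ma) + rng.normal(scale=0.02, size=30)

# Old characteristic curve, now biased after drift:
old_slope, old_offset = 0.30, 0.0

# Least-squares refit of the curve (slope first, then offset):
new_slope, new_offset = np.polyfit(signal_ma - 4.0, lab_value, deg=1)

print("old curve:", old_slope, old_offset)
print("new curve:", round(new_slope, 3), round(new_offset, 3))
```

Fitting against a window of many lab samples, instead of nudging the curve after each single deviation, is one simple way to make the updating less haphazard; the cited methods [Citation46, Citation47] go further by deciding when a refit is statistically warranted.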

As the data available are extensive, the probability of a data set containing false or missing values is rather high, even when the reliability of the individual sensors is high. For example, if the reliability of each sensor were 99%, the probability that a set of 1000 measurements contains no false values at any given time is only 0.99^1000 ≈ 0.004%. Therefore any extensive data analysis or simulation system must be able to detect false values and still provide the users with analysis results and predictions from such incomplete data.
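The point is easy to check: with per-sensor reliability r and n independent sensors, the chance of a completely clean snapshot is simply r^n, which collapses quickly for large n:

```python
# Probability that a snapshot of n independent sensor readings contains
# no false values, given a per-sensor reliability r: simply r**n.

def prob_all_valid(reliability, n_sensors):
    return reliability ** n_sensors

for r in (0.99, 0.999, 0.9999):
    print(f"r={r}: P(1000 sensors all valid) = {prob_all_valid(r, 1000):.4f}")
```

At 99% per-sensor reliability a clean 1000-sensor snapshot is essentially impossible, and even at 99.9% it occurs only about a third of the time, which is why robust handling of incomplete data is unavoidable.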

We will now discuss a specific application in more detail: the online control of continuous digesters. A physical model has been built, in this case in the Modelica/Dymola environment. The model receives input data from the DCS (process computer) and sends set points back to the DCS. The digester, which is 60 m high with a diameter of approximately 6 m, is divided into five sections, each further divided into five volume elements in the vertical direction and five in the horizontal, from the inlet pipe to the screens. There are also a number of recirculation loops in the digester. The model is two-dimensional (2D), with the volume elements in a plane. It could in principle be expanded to a 3D model, but due to calculation time limitations we have to stick to 2D for the time being.

In each volume element i we compute a mass and an energy balance from the flows in and out as well as from the chemical reactions, whose rates depend on the physical dimensions of the wood chips, the chemical concentrations and the temperature. The digester model is built on the same principle as the Purdue model presented in, for example, Bhartiya et al. [Citation48]. The chemical reactions comprise first the breaking of the bonds between lignin and fibres by [HS−], followed by extraction of the lignin in an alkaline environment depending basically on [OH−]. This process depends on the reactivity and properties of the components in the wood chips, given by an empirical constant C1 for each chip size distribution and wood species. The reaction rate also depends on the temperature T, determined experimentally and expressed as an Arrhenius term with constants A and B. The dissolution rate of the lignin, dL/dt, is then given as

dL_i/dt = −C1 · A·exp(−B/T) · f([OH−], [HS−]) · L_i,

where f is an empirical function of the liquor concentrations.

The total amount of lignin transferred from the wood chips in volume element i to the solution during the time step Δt is then

ΔL_i = L_i,t − L_i,t+1 ≈ −(dL_i/dt)·Δt.

The concentration of dissolved lignin (DL) in the free liquor in volume element i then changes according to the sum of the inflows of DL (Σ_in DL_i−1,t), minus the outflows (Σ_out DL_i,t), plus the dissolution from the wood chips (L_i,t − L_i,t+1) during Δt:

DL_i,t+1 = DL_i,t + Σ_in DL_i−1,t − Σ_out DL_i,t + (L_i,t − L_i,t+1).

For the lignin bound in the chips the same balance holds for the solid phase, with the opposite sign for the dissolution term.

DL and L here are given in kg/s.

The energy balance is given in a similar way but using temperatures and enthalpies of flows in and out as well as changes due to chemical reaction.

Today the inputs to the model are the chip flow, the size distribution, the chemical additions and the temperature, and predictions are made primarily for the dissolution of lignin. We then measure the actual kappa number of the fibres leaving the digester and, from the result, can adjust both the model constant C1 for a specific wood species and chip size distribution and the temperature and chemical additions in an optimal way [Citation49]. In the future we also intend to measure the quality of the wood entering the digester by near infrared (NIR) spectroscopy; the NIR spectrum is calibrated against the calculated C1 for the specified wood quality fed to the digester. This will allow good feed-forward control of the complete digester. The scheme was implemented in two mills, in Sweden and South Africa, during 2007. The model will also be used for diagnostics of digester performance; in this case pressure-flow calculations in the digester are included as well.
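The lignin balance described above can be sketched as explicit time-stepping over a simple 1D chain of volume elements with first-order dissolution and plug-flow transport. All constants (C1, A, B, T, residence time, element count) are hypothetical illustration values, not the parameters of the 2D mill model in the text:

```python
import math

# Toy 1D sketch of the digester lignin balance: a chain of volume
# elements, each applying dL/dt = -k*L with an Arrhenius rate
# k = C1 * A * exp(-B/T), plus plug-flow transport downstream.
# All constants are hypothetical illustration values.

C1 = 1.0                 # wood/chip-specific reactivity constant
A, B = 1.0e11, 14000.0   # Arrhenius constants
T = 430.0                # K, assumed uniform for this sketch
DT = 10.0                # s, time step
RESIDENCE = 600.0        # s, chip residence time per volume element
N = 5                    # number of volume elements in the chain

K = C1 * A * math.exp(-B / T)      # first-order dissolution rate, 1/s

def step(lignin):
    """One explicit Euler step: reaction, then plug-flow transport."""
    reacted = [max(li - K * li * DT, 0.0) for li in lignin]
    frac = DT / RESIDENCE          # fraction advected downstream per step
    out = [reacted[0] * (1 - frac) + 1.0 * frac]   # fresh chips enter, L = 1
    for i in range(1, N):
        out.append(reacted[i] * (1 - frac) + reacted[i - 1] * frac)
    return out

lignin = [1.0] * N                 # normalised lignin content per element
for _ in range(1000):              # 10,000 s of simulated time
    lignin = step(lignin)
print(["%.3f" % li for li in lignin])  # lignin decreases down the chain
```

At steady state the lignin content falls monotonically from inlet to outlet, which is the qualitative behaviour the kappa-number feedback in the text exploits when tuning C1.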

3. Research needs

An important challenge for modelling is the prediction of paper properties from data on process variables and the simulated constitution of the paper. Process simulation typically provides a concentration vector c at the wire section and/or the dry end. The consistencies simulated depend on the application, but would typically include separate fibre and filler concentrations and possibly the concentrations of fibre fractions. A paper property Q, such as strength or opacity, is a function of the constitution of the paper and the process variables p, that is,

Q = f(c, p).  (1)

In order to develop control strategies for paper properties or to optimise the properties online, such models must be known. However, the process variables exert their effect on paper quality through the structure of the paper, via mechanisms that are poorly known. Developing quality models from physical models has therefore been extremely slow – virtually all approaches are data based. Specific outcomes currently available are the several soft sensors and model-based control loops operated in paper mills, as described above. These projects accumulate complex knowledge about the way in which the process affects product quality, but to date they are individual approaches. An overview and a holistic scientific approach addressing the relationships between process variables and fibre network structure, and between network structure and paper properties, is largely missing, and is thus an important future task for research.
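The data-based route described above, fitting a property model Q = f(c, p) from measurements, can be sketched as a linear least-squares fit. All data and coefficients below are synthetic illustration values; real quality models are mill-specific:

```python
import numpy as np

# Data-based sketch of a property model Q = f(c, p): fit a linear model
# from synthetic furnish concentrations (c) and a process variable (p)
# to a synthetic "strength" property. All values are illustrative only.

rng = np.random.default_rng(3)
n = 300
fibre = rng.uniform(0.5, 1.5, n)      # c: fibre concentration (arbitrary units)
filler = rng.uniform(0.1, 0.5, n)     # c: filler concentration
speed = rng.uniform(0.8, 1.2, n)      # p: machine speed (normalised)

# Hypothetical "true" strength relation plus measurement noise:
strength = 40 * fibre - 25 * filler - 10 * speed + rng.normal(0, 1.0, n)

X = np.column_stack([fibre, filler, speed, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, strength, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((strength - pred) ** 2) / np.sum((strength - strength.mean()) ** 2)
print("fitted coefficients:", np.round(coef, 2), " R^2:", round(r2, 3))
```

A linear fit recovers this synthetic relation almost exactly; the drift problem noted in the Swedish mill case arises precisely because real relations are neither linear nor stationary.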

The behaviour of fibres and fillers throughout the process is quite well understood and is implemented in present simulators. However, the constitution has other components that are highly relevant when predicting paper properties, such as many of the additives, and some components, such as retention aids, strongly affect the behaviour of the other constituents. There is therefore a need to expand the concentration vector c to cover the chemical additives. The interactions between constituents mean that the models of separation processes become increasingly complicated: a simple division-matrix approach must be replaced by a nonlinear model

c_out = g(c_in, p) (2)

With an appropriate description of the paper constituents and their dynamics, Equation (2), establishing generic property models, Equation (1), requires knowing how the properties of the constituents affect each paper property.
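To make the division-matrix versus nonlinear-model distinction concrete, the sketch below contrasts a fixed split vector with a split function g(c_in) that depends on the whole inlet vector. The particular interaction used (fines and filler retention falling with filler load) and all numbers are invented for illustration.

```python
def divide_linear(c_in, split):
    """Division-matrix model: each component's split fraction is constant."""
    accept = {k: split[k] * v for k, v in c_in.items()}
    reject = {k: v - accept[k] for k, v in c_in.items()}
    return accept, reject

def divide_nonlinear(c_in):
    """Nonlinear model g(c_in): fibre split fixed, fines/filler retention
    decreasing with total filler load (an assumed interaction)."""
    fines_split = 0.6 / (1.0 + 0.5 * c_in["filler"])   # invented relation
    split = {"fibre": 0.95, "fines": fines_split, "filler": fines_split}
    return divide_linear(c_in, split)
```

In the linear model the split of fines never changes; in the nonlinear one, raising the filler concentration in the feed lowers the fines carried to the accept stream, which is the kind of constituent interaction the text argues must be captured.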

However, in spite of active research in which the process variables are eliminated, for example by laboratory sheet forming or by keeping web-formation conditions constant, no research team has yet reached a satisfactory level of results. Clearly this is a research area in which a large number of projects are required to cover ground unknown today. One factor hindering open scientific exchange in this field seems to be that any breakthrough would give the industrial or scientific player achieving it a large edge over its competitors.

As pulp and paper are based on natural materials processed in a complex way, variations both in process behaviour and in paper properties are an area to be addressed, so that variability can be reduced and product quality stabilised by optimal, model-based actions. This requires extensive statistical and dynamic analysis of process data.
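The statistical and dynamic analysis called for can start with very simple estimators: mean, variance and lag-k autocorrelation of a measured process signal, the last of which distinguishes slow dynamic drift from uncorrelated noise. The functions below are standard estimators written out in plain Python; the biased (1/n) normalisation is a deliberate, conventional choice.

```python
def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    """Population (1/n) variance of the signal."""
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def autocorr(xs, lag):
    """Sample autocorrelation at the given lag (biased 1/n estimator)."""
    m, v, n = mean(xs), variance(xs), len(xs)
    cov = sum((xs[i] - m) * (xs[i + lag] - m) for i in range(n - lag)) / n
    return cov / v
```

A strongly negative lag-1 autocorrelation, for instance, flags an alternating (often over-controlled) variable, whereas values near +1 indicate slow drift that a model-based controller could remove.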

The modelling and optimisation methods currently used to improve the efficiency of paper production can be extended to the pulp and paper value chain. The unit process models of the supply chain are typically simpler, but the structure of the system is more complex and the integration of cost models is required. The resulting operational and/or design optimisation problems will be very large, but the economic opportunity is also much larger than for production processes, in which streamlining and efficiency-improvement activities have been carried out for a long period of time.
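A toy version of such a value-chain problem is sketched below: choose which mill serves which customer order so that combined production and transport cost is minimised subject to mill capacities. All mills, orders, costs and capacities are invented, and the exhaustive search is only viable at this scale; realistic supply-chain problems require MILP or decomposition methods.

```python
from itertools import product

mills = ["A", "B"]
orders = ["o1", "o2", "o3"]
prod_cost = {"A": 400.0, "B": 420.0}                      # EUR/t at each mill
transport = {("A", "o1"): 10, ("A", "o2"): 30, ("A", "o3"): 25,
             ("B", "o1"): 35, ("B", "o2"): 8,  ("B", "o3"): 12}   # EUR/t
tonnes = {"o1": 100, "o2": 80, "o3": 50}
capacity = {"A": 150, "B": 150}                           # t per mill

def plan():
    """Exhaustively assign orders to mills, minimising total cost."""
    best = None
    for assign in product(mills, repeat=len(orders)):
        load = {m: 0 for m in mills}
        cost = 0.0
        for o, m in zip(orders, assign):
            load[m] += tonnes[o]
            cost += tonnes[o] * (prod_cost[m] + transport[(m, o)])
        if all(load[m] <= capacity[m] for m in mills):
            if best is None or cost < best[0]:
                best = (cost, dict(zip(orders, assign)))
    return best
```

Even this toy version shows the structural point made above: each unit model (a cost per tonne) is trivial, yet coupling them through shared capacities already creates a combinatorial optimisation problem.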

As a result of a workshop organised by COST Action E36, the research needs of the pulp and paper industry have been identified by scientists representing the most important research groups in Europe in this field. The following topics were named [Citation50]:

  • A fingerprint approach to the process state as a key to process stability and reproducibility.

  • Designing a handheld device that supports the operator in taking optimum decisions.

  • Decision support for higher energy efficiency.

  • More robust models – a move from statistical to physical models.

  • Safeguarding the validity of advanced process control models over the full life cycle.

  • A holistic modelling approach to optimise the pulp and paper value chain.

Summarising these ideas, from the viewpoint of research one of the most important subjects to be addressed is the development of tools that support paper machine operators in evaluating and optimising the state of the process concerned. In addition, more effort has to be put into more reliable and capable modelling approaches themselves. A more visionary aspect of research that has been named is the concurrent design of material and information flows, aimed at integrating, and eventually optimising, the entire supply chain.

4. Conclusions

An optimal use of resources results in optimal mill operations and increased productivity, which means substantial financial benefits. While staff experience is the main basis of existing control methods, the inclusion of models providing a new generation of process predictions and controls would be of considerable value. Modelling and simulation are therefore important tools for reaching the two primary goals of the pulp and paper industry: decreasing production costs and increasing the added value of the product.

During recent years many applications for improving process performance by controlling different process variables have been implemented at industrial level. However, the development of models to predict and optimise product properties based on process variables is still in progress. The main difficulty is that papermaking is a highly dynamic process and, in particular, the dynamics of the wet end are still not fully understood.

Looking into the future of knowledge-based processing, interest in model-based control and optimisation seems set to increase. Modelling and optimisation of paper properties and model-based paper design will grow strongly, whereas, from the production point of view, process simulation and off-line optimisation will remain at their current level. Equipment and chemical suppliers and consultants, however, will increase their use of these tools to develop processes and control strategies.

References

  • COST E36, 2005. Available at http://www.costE36.org/
  • Orccotoma, J.A., Paris, J., Perrier, M. and Roche, A.A. 1997. Dynamics of white water networks during web breaks. Tappi J., 80(12): 101–110.
  • Hauge, T.A., Slora, R. and Lie, B. 2005. Application and roll-out of infinite horizon MPC employing a non-linear mechanistic model to paper machines. J. Process Control, 15(2): 201–213.
  • Laperriere, L. and Wasik, L. 2002. Implementing optimisation with process simulation. Tappi J., 85(6): 7–12.
  • Hutter, A. and Kappen, J. 2004. "Analyse von Wasserkreisläufen". In Wasserkreisläufe in der Papiererzeugung. Verfahrenstechnik und Mikrobiologie, edited by Kappen, J. and Pauly, D. Munich: Papiertechnische Stiftung.
  • Box, G., Hunter, W.G. and Hunter, J.S. 1978. Statistics for Experimenters, 1st ed., New York: John Wiley & Sons, Inc.
  • Blanco, A., Fuente, E., Negro, C. and Tijero, J. 2002. Flocculation monitoring: focused beam reflectance measurement as a measurement tool. Can. J. Chem. Eng., 80(4): 734–740.
  • Negro, C., Fuente, E., Blanco, A. and Tijero, J. 2005. Flocculation mechanism induced by phenolic resin/PEO and floc properties. AIChE J., 51(3): 1022–1031.
  • Thomas, D.N., Judd, S.J. and Fawcett, N. 1999. Flocculation modelling: a review. Water Res., 33(7): 1579–1592.
  • Masmoudi, R.A. 1999. Rapid prediction of effluent biochemical oxygen demand for improved environmental control. Tappi J., 82(10): 111–119.
  • Miyanishi, T. and Shimada, H. 1998. Using neural networks to diagnose web breaks on a newsprint paper machine. Tappi J., 81(9): 163–170.
  • Parrilla, R. and Castellanos, J.A. 2003. Aplicación del control predictivo en la parte húmeda de la máquina de papel. Papel, 108(11): 63–67.
  • Blanco, A., Alonso, A., Negro, C. and San Pio, I. 2005. Advanced data treatment to improve quality in a newsprint machine. International Symposium IPE-PTS-CTP, Barcelona, Spain, 18–20 May.
  • Martens, H. and Naes, T. 1998. Multivariate Calibration, Chichester: John Wiley & Sons.
  • Matlab 6.5 R13. 2002. User's Guide, Natick: The Mathworks Inc.
  • Wang, H., Wang, A.P. and Duncan, S.R. 1997. Advanced Process Control in Paper and Board Making, Surrey, UK: Pira Technology Series, Pira International.
  • Tsoukalas, L.H. and Uhrig, R.E. 1997. Fuzzy and Neural Approaches in Engineering, New York: Wiley Interscience.
  • Kumar, A. and Hand, V.C. 2000. Using genetic algorithms and neural networks to predict and optimize coated board brightness. Ind. Eng. Chem. Res., 39(12): 4956–4962.
  • Aguiar, H.C. and Filho, R.M. 2001. Neural network and hybrid model: a discussion about different modeling techniques to predict the pulping degree with industrial data. Chem. Eng. Sci., 56(2): 565–570.
  • Broeren, L.A. and Smith, B.A. 1996. Process optimization with neural network software. Prog. Pap. Recycling, 5(2): 95–98.
  • Campoy-Cervera, P., Muñoz-García, D.F., Pena, D. and Calderón-Martínez, J.A. 2001. Automatic generation of digital filters by NN based learning: an application on paper pulp inspection. Lect. Notes Comput. Sci., 2085: 235–245.
  • Vaughan, J.S., Gottlieb, P.M., Lee, S.-C. and Beilstein, J.R. 1999. The Development of a Neural Network Soft Sensor for Chlorine Dioxide Stage Brightness Control. TAPPI 99 'Preparing for the next millennium', Atlanta, GA, USA, 1–4 March, Book 1, ISBN 0-89852-734-1, 147–159.
  • Smith, G.C., Wrobel, C.L. and Stengel, D.L. 2000. Modeling TRS and sulphur dioxide emissions from a Kraft recovery boiler using an artificial neural network. Tappi J., 83(11): 69.
  • Joeressen, A. 2001. Predicting corrugated box compression values through innovative software. Developments in manufacture, technology and markets for corrugated board, Manchester, UK, 17–18 September, 4.
  • Chen, J. and Liao, C.M. 2002. Dynamic process fault monitoring based on neural network and PCA. J. Process Control, 12: 277–289.
  • Alonso, A., Blanco, A. and Negro, C. 2005. Current use of software in COST Action E36, COST Report. Available at http://www.costE36.org
  • Klemona, K. and Turunen, I. 2001. State of modelling and simulation in the Finnish process industry, Helsinki: Universities and Research Centres, TEKES, Technology Review 107/2001.
  • Goedsche, F. and Bienert, C. 2002. On-line Prozessanalyse und Computersimulation von Herstellungsprozessen. Project report, 37.
  • Mayböck, R. The DLL wrapper: the Andritz vision for advanced process control. IDEAS User Group Conference (UGC), November 4–5.
  • Mathworks, 2005. Available at http://www.mathworks.com
  • Kokko, T. 2002. Development of papermaking process based on integrated process and control design. Doctoral thesis, Tampere University of Technology, Finland.
  • Hämäläinen, J. and Tarvainen, P. 2002. CFD-optimized headbox flows. Pulp Pap. Canada, 103(1): 39–41.
  • Gustafsson, J.E. and Kaul, V. 2001. A general model of deformation and flow in wet fibre webs under compression. Nordic Pulp Pap. Res. J., 16(2): 149–155.
  • Honkalampi, P. and Kataja, M. 2002. Dry content analysis in wet pressing: sensitivity to pressing variables. Nordic Pulp Pap. Res. J., 17(3): 319–325.
  • Cutshall, K. 1997. Nature of Paper Variation, USA: TAPPI Wet End Operations, Short Course.
  • Lappalainen, J., Myller, T., Vehviläinen, O., Tuuri, S. and Juslin, K. 2003. Enhancing grade changes using dynamic simulation. Tappi J. Online Exclusive, 2(12).
  • Hakanen, J., Mäkelä, M., Miettinen, K. and Manninen, J. 2004. On interactive multiobjective optimization with NIMBUS in chemical process design. MCDM 2004, Whistler, B.C., Canada, 6–11 August.
  • Scott, W. and Shirt, R. Potential application of predictive tensile strength models in paper manufacture: Part II – Integration of a tensile strength model with a dynamic paper machine material balance simulation. TAPPI Papermakers Conference, pp. 879–887.
  • Vänni, P. and Launonen, U. 2005. Maximising total profitability with dynamic production planning. Appita J., 58(3): 177–179.
  • Bogo, A. How IDEAS helped manage our project risk. IDEAS User Group Conference (UGC), 4–5 November.
  • Ritala, R., Belle, J., Holmström, K., Ihalainen, H., Ruiz, J., Suojärvi, M. and Tienari, M. Operations decision support based on dynamic simulation and optimization. Proceedings of PulPaper 2004 Conferences (Energy, Coating, Efficiency), The Finnish Paper Engineers' Association – PI, Helsinki, Finland, June 1–3, pp. 55–62.
  • Williamson, M., 2005. Available at http://Paperloop.com
  • Dahlquist, E., Ekwall, H., Lindberg, J., Sundström, S., Liljenberg, T. and Backa, S. 1999. On-line characterization of pulp – stock preparation department, Stockholm: SPCI.
  • Dahlquist, E., Wallin, F. and Dhak, J. 2004. Experiences of on-line and off-line simulation in pulp and paper industry. PTS Symposium, Munich, March 8–9.
  • Jansson, J. and Dahlquist, E. 2004. Model based control and optimization in pulp industry. SIMS 2004, Copenhagen, September 23–24.
  • Latva-Käyrä, K., Ihalainen, H. and Ritala, R. 2006. Dynamic validation of on-line measurements: a probabilistic analysis. Measurement, 39(4): 33–51.
  • Latva-Käyrä, K. and Ritala, R. Sensor diagnostics based on dynamic characteristic curve estimation. 10th International Conference on Technical Diagnostics, Budapest, Hungary, June 9–10.
  • Bhartiya, S., Dufour, P. and Doyle, F.J. Thermal-hydraulic modeling of a continuous pulp digester. Proceedings from Conference on Digester Modeling, Annapolis, June.
  • Avelin, A., Jansson, J. and Dahlquist, E. Use of mathematical models and simulators for on-line applications in pulp and paper industry. Conference Proceedings MathMod 2006, Vienna.
  • Research activities: collaboration within new research projects, Tampere: Workshop June 2006. Available at http://www.coste36.org/docs/E36_research-meeting_Tampere_minutes.pdf
