Article

Performance Assessment of Predictive Control—A Survey

Institute of Control and Computation Engineering, Warsaw University of Technology, ul. Nowowiejska 15/19, 00-665 Warsaw, Poland
Algorithms 2020, 13(4), 97; https://doi.org/10.3390/a13040097
Received: 15 March 2020 / Revised: 8 April 2020 / Accepted: 10 April 2020 / Published: 17 April 2020
(This article belongs to the Special Issue Model Predictive Control: Algorithms and Applications)

Abstract
Model Predictive Control constitutes an important element of any modern control system. There is growing interest in this technology. More and more advanced predictive structures have been implemented. The first applications were in chemical engineering, and now Model Predictive Control can be found in almost all kinds of applications, from the process industry to embedded control systems or for autonomous objects. Currently, each implementation of a control system requires strict financial justification. Application engineers need tools to measure and quantify the quality of the control and the potential for improvement that may be achieved by retrofitting control systems. Furthermore, a successful implementation of predictive control must conform to prior estimations not only during commissioning, but also during regular daily operations. The system must sustain the quality of control performance. The assessment of Model Predictive Control requires a suitable, often specific, methodology and comparative indicators. These demands establish the rationale of this survey. Therefore, the paper collects and summarizes control performance assessment methods specifically designed for and utilized in predictive control. These observations present the picture of the assessment technology. Further generalization leads to the formulation of a control assessment procedure to support control application engineers.

1. Introduction

Modern control systems are organized into a hierarchical structure, often presented in the form of a functional pyramid, as shown in Figure 1. The controlled plant is situated at the bottom. An instrumentation layer sits atop the process layer and enables the upper levels to communicate with it. The regulatory control layer is organized into basic univariate loops that mostly utilize the PID algorithm, which constitutes a significant majority (>90% or even >95%) of the algorithms in use [1,2,3]. These percentages might seem surprising, but more advanced control techniques, such as Kalman filtering and optimal, robust, predictive, and adaptive algorithms, are only used in a limited number of applications, such as military systems, expensive processes, nuclear plants, and the like. This is a small fraction compared to civilian industrial plants: even the largest MPC applications in the process industry are not comparable in number to PID applications, which can be counted in the billions [4]. Moreover, these advanced applications usually come in the form of supervisory implementations over regulatory PID loops; in such cases, the PID loops are not replaced, but remain in use. The majority of control applications are not very complex and not very demanding. Indeed, most PID controllers only use the PI elements.
However, the improvements achievable with the PID control rules are limited. More complicated controllers, such as multivariate, nonlinear, predictive, adaptive, or ones using soft computing, fall under the general term of Advanced Process Control (APC) [5,6]. They go beyond the scope of operation of PID loops. The majority of the Model Predictive Control (MPC) implementations are situated here, although, in some cases, MPC plays the role of a regulatory control without any downstream PID loop. The supervisory level consists of Process Optimization (PO), economic planning, and long-term scheduling.
A properly designed and tuned controller allows high operational performance to be achieved [7], while a poorly tuned or improperly selected control philosophy results in worse total process performance. Furthermore, most real full-scale industrial plants are non-stationary, nonlinear, and complex. The owner of the installation is interested not only in a single-shot result, but also in sustaining the improvement. On-line performance monitoring, diagnostics, and maintenance play increasingly important roles and constitute inevitable aspects of good practice on site. These aspects appear at the PID regulatory level, but they are crucial for APC [8,9] solutions, as advanced controls mostly operate close to the technological constraints. The base control is expected to maintain operation in automatic (AUTO) mode, while APC is aimed at additional financial benefits.
Advanced control techniques are becoming more and more popular. MPC is their main component and is often synonymous with APC. A predictive strategy with a receding horizon computes the control signal, called the Manipulated Variable (MV), on the basis of the embedded process model. While the model supports the controller with prediction, optimization is used to calculate the control law by minimizing a given cost function while simultaneously satisfying the constraints. The MPC approach is very flexible: it makes it possible to control processes described by linear [10] or nonlinear models [11], and it can incorporate on-line setpoint optimization [12] or fault-tolerant approaches [13].
The implementation of APC predictive controllers is a complex task, taking more time and materials than the startup of a univariate PID loop [14]. Such an installation is always preceded by and concluded with an assessment, which is used to justify the effort and calculate the benefits. Thus, the performance assessment of MPC is required even more than for other regulatory algorithms.
Control Performance Assessment (CPA) is actually as old as controllers themselves. Engineers always want and need to know how good a system is and whether it can be improved, and therefore require quantitative indexes to measure it. Dozens of different approaches and indicators have been developed over that time [9], ranging from step-response measures such as overshoot and settling time up to complex model-based or multifractal methodologies. The assessment is closely associated with, and often included in, an activity called a control feasibility (or performance) study, which measures the current quality and estimates the potential benefits of improved control.
MPC assessment has two faces. On the one hand, it seems simple: the predictive strategy embeds a performance index, which can be used directly. On the other hand, this index belongs to the internal controller domain and its external availability is limited. Industrial MPC applications rarely allow direct access to their interior. Furthermore, getting at MPC internal parameters often requires specific knowledge of the system. One would therefore expect an external, objective, and vendor-independent methodology.
Following the above stipulations, the contribution of this paper can be clearly introduced. The prime objective is to present the available control performance quality measures and approaches that can be effectively used to assess real Model Predictive Control applications. The techniques are systematically presented following the common classification into model-free approaches and model-based methods, the latter requiring some modeling with a priori assumptions in order to evaluate assessment metrics. The presentation of CPA methods concludes with a generalization and the synthesis of a practical assessment procedure supporting control application engineers.
This paper starts with two introductory sections that introduce the two key issues: Model Predictive Control in Section 2 and CPA in Section 3. These are followed by the main contribution (Section 4), i.e., a survey of available methods, measures, and reported implementations of predictive control assessment. Section 5 summarizes these observations with a proposed procedure that helps in the execution of MPC-CPA projects. The paper concludes in Section 6 with a discussion and a presentation of open research issues.

2. Model Predictive Control

Model Predictive Control [15] contributes significantly to the frequent usage of APC in the process industry. When regulatory control utilizing the PID algorithm is not sufficient, there is an opportunity for a predictive control strategy. MPC history starts with Kalman's research [16] on the Linear Quadratic Regulator (LQR) in the early 1960s. Consecutive research brought the introduction of Model Predictive Heuristic Control [17] (now known as Model Algorithmic Control, MAC) and Dynamic Matrix Control (DMC) [18] in the late 1970s. Generalized Predictive Control (GPC) was introduced in the 1980s [19,20]. MPC is characterized by the fact that a mathematical model of the process is continually used to predict its future behavior and find the optimal control strategy [21]. The optimization procedure is repeated at each sampling interval.
Predictive control is renowned for its high accuracy and its ability to embed process limitations into the algorithm. However, the need for a precise process model simultaneously constitutes its main shortcoming. Such algorithms have mainly been utilized in the process industry [22,23,24,25], e.g., in paper machines, petrochemical and chemical installations, reactors, and turbines. Nowadays, thanks to modern micro-controllers, MPC solutions are also developed for nonlinear [26,27] fast embedded systems, with applications such as unmanned vehicles [28], cars [29], vehicles’ anti-lock brake systems [30], active vibration suppression [31], combustion engines [32], and unmanned aerial vehicles [33].

General MPC Rule

Taking a multivariate plant with $n_u$ MVs and $n_y$ Controlled Variables (CVs) as an example, the vector of controller outputs is $u = [u_1, \ldots, u_{n_u}]^T$ and the vector of process outputs is $y = [y_1, \ldots, y_{n_y}]^T$. During each sampling interval $k = 0, 1, \ldots$, the predictive control law calculates in real time the increments of future MVs in the form of a vector of length $n_u N_u$

$$\Delta u(k) = \left[ \Delta u(k|k), \ldots, \Delta u(k+N_u-1|k) \right]^T, \quad (1)$$

where $N_u$ denotes the control horizon. The signal increments for the future sampling period $k+p$, evaluated at the current moment $k$, are denoted by $\Delta u(k+p|k)$ and defined as

$$\Delta u(k+p|k) = \begin{cases} u(k|k) - u(k-1) & \text{for } p = 0 \\ u(k+p|k) - u(k+p-1|k) & \text{for } p \geq 1 \end{cases}. \quad (2)$$

The general nonlinear MPC optimization problem can be formulated in vector-matrix form as

$$\min_{\Delta u(k)} \left\{ \left\| y^{\mathrm{sp}}(k) - \hat{y}(k) \right\|_{\Psi}^2 + \left\| \Delta u(k) \right\|_{\Lambda}^2 \right\}, \quad (3)$$

subject to

$$u^{\min} \leq u(k) \leq u^{\max}, \quad -\Delta u^{\max} \leq \Delta u(k) \leq \Delta u^{\max}, \quad y^{\min} \leq \hat{y}(k) \leq y^{\max}.$$

The above quadratic norms are defined as $\|x\|^2 = x^T x$ and $\|x\|_A^2 = x^T A x$. The setpoint trajectory vector $y^{\mathrm{sp}}(k) = [y^{\mathrm{sp}}(k|k), \ldots, y^{\mathrm{sp}}(k+N-1|k)]^T$, the predicted trajectory vector $\hat{y}(k) = [\hat{y}(k|k), \ldots, \hat{y}(k+N-1|k)]^T$, and the vectors indicating output constraints, i.e., $y^{\min} = [y^{\min}, \ldots, y^{\min}]^T$ and $y^{\max} = [y^{\max}, \ldots, y^{\max}]^T$, are of length $n_y N$, where $N$ denotes the prediction horizon.
The respective vectors for input constraints, $u^{\min} = [u^{\min}, \ldots, u^{\min}]^T$, $u^{\max} = [u^{\max}, \ldots, u^{\max}]^T$, $\Delta u^{\max} = [\Delta u^{\max}, \ldots, \Delta u^{\max}]^T$, and the vector $u(k) = [u(k|k), \ldots, u(k+N_u-1|k)]^T$ are of length $n_u N_u$, while the weighting matrices $\Lambda = \mathrm{diag}(\lambda, \ldots, \lambda)$ and $\Psi = I$ are of dimensions $n_u N_u \times n_u N_u$ and $n_y N \times n_y N$, respectively.
The dependence of $\hat{y}(k+p|k)$ on past process inputs and outputs and on the decision variables $\Delta u(k+p|k)$, $p = 0, \ldots, N_u-1$, is in general given by a nonlinear model.
The role of the first part of the MPC cost-function in Equation (3) is to minimize predicted control errors over the prediction horizon N. Setpoint and predicted values of the process output for future sampling interval k + p are known or calculated for a current moment k. Predicted process outputs are calculated with a mathematical model of the controlled process. The role of the second part of the performance index in Equation (3) is to eliminate excessive variations in controller outputs. Generally, the constraints may be imposed on MVs future values over the control horizon:
  • on their minimal and maximal permissible limits $u^{\min}$ and $u^{\max}$;
  • on their future changes, with a limiting value of $\Delta u^{\max}$; and
  • on process output predictions (also over the prediction horizon), denoted as $y^{\min}$ and $y^{\max}$.
Although an entire sequence of decision variables (Equation (1)) is calculated, only the first vector element is applied to the process. During the next sampling period, $k+1$, the CV measurement is updated and the procedure is repeated. The underlying MPC optimization problem may be extended, for instance by taking into consideration a stabilizing terminal constraint. Furthermore, additional constraints might be necessary in some specific applications, for instance those connected with auxiliary variables or override controls.
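For the unconstrained case with a linear prediction model, the minimizer of Equation (3) can even be written in closed form. The following Python sketch shows one receding-horizon step under these simplifying assumptions; the dynamic matrix `G`, the free response `y0`, and the weight `lam` are illustrative names, not taken from the paper.

```python
import numpy as np

def mpc_move(G, y0, ysp, lam):
    """One receding-horizon step of a linear MPC of the form of Equation (3).

    G   -- dynamic (step-response) matrix mapping future MV increments to CVs
    y0  -- free response: predicted CVs if no further control moves are made
    ysp -- setpoint trajectory over the prediction horizon
    lam -- move-suppression weight (the lambda in the Lambda matrix)

    With Psi = I and Lambda = lam * I and no active constraints, the minimum
    of ||ysp - y0 - G du||^2 + lam * ||du||^2 is analytic.
    """
    Nu = G.shape[1]
    K = np.linalg.solve(G.T @ G + lam * np.eye(Nu), G.T)
    du = K @ (ysp - y0)
    return du[0]  # only the first increment is applied to the plant

# Toy example: first-order-like step response over N = 5, Nu = 2.
step = np.array([0.5, 0.8, 0.95, 0.99, 1.0])
G = np.zeros((5, 2))
G[:, 0] = step
G[1:, 1] = step[:-1]
du0 = mpc_move(G, y0=np.zeros(5), ysp=np.ones(5), lam=0.1)
```

In a receding-horizon loop, this computation would be repeated at every sampling instant with updated measurements, exactly as described above.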
The quadratic cost function formulation is highly sensitive to outliers of any kind [34]. Statistics suggests using other estimates, such as the mean absolute error, formulated as an $\ell_1$ norm. An analysis of $\ell_1$-MPC may be found in [35]. It is also natural that researchers have considered other norms, e.g., Bemporad et al. [36] explored $\ell_\infty$-MPC. Another approach to improving the MPC quadratic performance index was proposed by Gallieri [37], who regularized least squares with an $\ell_1$ component.
In all MPC algorithms, a dynamic model of the controlled process is used to predict the future values of the output variable, $\hat{y}(k+p|k)$, over the prediction horizon, i.e., for $p = N_1, \ldots, N_2$. The receding horizon predictive control principle formulated above is presented graphically in Figure 2.
As stated above, there are dozens of versions of the predictive control rule. The difference manifests itself in the selection of:
  • process model;
  • performance index formulation;
  • utilized optimization algorithm; or
  • algorithm numerical representation.
Actually, there are no limitations on these selections. The model can be of any form, linear or nonlinear. The performance index may take any form, and the optimization algorithm may in general be any nonlinear, non-gradient method, unconstrained or with constraints handled in the form of penalty functions [38]. One may use any kind of local or global optimization approach. Such a generalization only requires the repetitive calculation of the entire optimization task during each sampling period. This frequently consumes a lot of computational resources and may end up violating the sampling interval. Such an optimization is called Nonlinear Optimization MPC (NO-MPC) or repetitive control with a receding horizon. There are various simplifications of this general rule that lead toward simpler, easier-to-apply algorithms requiring much lower computational resources.
In the general formulation, the original Model Predictive Control optimization problem in Equation (3) leads to a constrained nonlinear task, which has to be solved repetitively in real time during each sampling period. MPC with Nonlinear Prediction and Linearization Along the Predicted Trajectory (MPC-NPLPT) [39] has been proposed to address this issue. It significantly reduces the required calculations with only a minor loss of efficiency. Unlike in simple algorithms with successive model linearization [40], the linear approximation of the future CV trajectory prediction over the prediction horizon N is evaluated during each sampling period: linearization with respect to the MV increments in Equation (1) is performed along an assumed future trajectory. This allows the formulation of a computationally efficient quadratic MPC-NPLPT problem. Only a single trajectory linearization is performed while the process is close to the required setpoint; however, when the setpoint changes or the process is affected by a disturbance, trajectory linearization is invoked and the quadratic optimization is repeated a few times. Research shows that the MPC-NPLPT algorithm gives satisfactory performance, similar to the general, computationally demanding NO-MPC [41].
Despite numerous variants and versions of the MPC control rule, two main algorithmic simplifications seem to be the most popular, i.e., DMC and GPC. They are clearly simpler than NO-MPC. They use linear models which, together with a quadratic performance index, make it possible to derive the control rule analytically. Thereby, no repetitive optimization is needed: the control rule is evaluated only once, enabling simple algorithm coding and application. Such a simplification enables embedding DMC or GPC inside a DCS system [42]. The main difference between DMC and GPC lies in the type of model used.
DMC was first presented by Cutler and Ramaker at an AIChE meeting in 1979 [18] and at the Automatic Control Conference in 1980 [43]. Almost simultaneously, it was applied to a catalytic cracking unit, and the algorithm was modified to handle nonlinearities and constraints [44]. The algorithm went through several modifications, for instance Quadratic Dynamic Matrix Control (QDMC) [45], which uses quadratic programming to solve the constrained open-loop optimal control task, where the system is linear, the objective function is quadratic, and the constraints are defined by linear inequalities, or the numerically efficient version used in embedded environments [46]. The main feature of the DMC algorithm is the discrete-time step-response model of the controlled process used for prediction, i.e., for finding the values of $\hat{y}(k+p|k)$, although this limits it to stable processes. Its main advantage is the step-response process model, which may easily be obtained in practice. Since the step-response model is linear in terms of the manipulated variables, minimization of the general MPC cost-function in Equation (3) leads to a computationally simple quadratic optimization task. When there are no constraints imposed on the process variables, the solution may be evaluated analytically, and the unconstrained MPC solution may be projected onto the admissible set determined by the constraints [11].
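As a rough illustration of the step-response prediction idea (a sketch, not code from the paper; the coefficient vector `s` and the helper name are assumptions), the free response of a stable process, i.e., the predicted CV trajectory when no further moves are made, can be computed directly from recorded step-response coefficients:

```python
import numpy as np

# DMC-style prediction uses y(k) = sum_j s_j * du(k - j), truncated at a
# horizon D where the response has settled (the stable-process assumption).

def dmc_free_response(s, past_du, N):
    """Predicted CV trajectory over N future steps with no new moves.

    s       -- step-response coefficients s_1..s_D (settled: s_D ~ gain)
    past_du -- past MV increments, past_du[0] being the most recent du(k-1)
    N       -- prediction horizon
    """
    D = len(s)
    y0 = np.zeros(N)
    for p in range(1, N + 1):
        acc = 0.0
        # contribution of each past increment du(k-j) at future step k+p
        for j, du in enumerate(past_du, start=1):
            idx = min(p + j, D) - 1  # response settles after D samples
            acc += s[idx] * du
        y0[p - 1] = acc
    return y0

s = np.array([0.5, 0.8, 0.95, 0.99, 1.0])      # settled step response
y0 = dmc_free_response(s, past_du=[0.2], N=3)  # one past move of 0.2
```

The dynamic part of the prediction, due to future moves, is then added through the dynamic matrix built from the same coefficients, which is what makes the resulting optimization a simple quadratic task.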
Generalized Predictive Control was introduced by Clarke in 1987 [19,20], with several further extensions (e.g., [47,48]). A regression-type discrete difference equation is used as the model in the GPC algorithm. Such a model may be called an Auto-Regressive Integrated Moving Average with auXiliary Input (ARIMAX) or Controlled Auto-Regressive Integrated Moving Average (CARIMA) model [11]. It is often assumed that the process is affected by an integrated white noise, which simplifies the resulting process model. It is also important to note that, according to the GPC prediction scheme, future CV predictions are simple linear functions of the calculated decision vector plus a free trajectory, which depends only on the past. Thereby, the general MPC optimization problem can be modified accordingly. As the prediction relation is linear in terms of the decision vector, the resulting optimization problem is of the Quadratic Programming (QP) type, which means that the cost-function is quadratic and all constraints are linear.
The above MPC configurations require a priori knowledge about the process, which must somehow be derived, for instance using first-principles modeling [49,50] or experimental empirical identification, e.g., artificial neural networks [51,52], Hammerstein–Wiener configurations [41,53], fuzzy [54,55] and neuro-fuzzy [56,57] models, or Gaussian processes [58,59]. Apart from the majority of MPC configurations, which use an internal process model, there are techniques that do not require an explicit model definition. Model-free configurations include, among others, machine learning techniques such as regression trees [60], random forests [61], or reinforcement learning [62].
Although both GPC and DMC predictive algorithms are well established within the industry and many successful implementations have been reported over the last 40 years, their practical design, tuning, and performance assessment are still challenging tasks [63,64].

3. Control Performance Assessment

Industrial control systems frequently do not perform effectively [65] for many reasons, for instance: inadequate supervision, process non-stationarity, instrumentation failures, incorrect design, poor tuning, changing operating points, lack of engineering expertise, disturbances, noises, or human factors [66,67].
Maintenance performed by plant resources alone is hardly sufficient. Scientists and practitioners continually try to develop automatic and autonomous solutions to the problem. The CPA story started with simple univariate PID-based loop assessment. The first adequate report was published by Åström [68] for a pulp and paper plant in 1967, using the benchmarking of the process variable standard deviation. Control assessment solutions have evolved for more than 50 years in different directions, delivering mature approaches, measures, and procedures to the industry. There are many different representations of the industrial assessment process; Figure 3 shows a generalized diagram of an exemplary industrial CPA utilization process.
A few classifications of the methods, together with block tree diagrams visualizing functional similarities and differences, may be found in the literature. Figure 4 presents a graphical diagram of the generalized CPA techniques classification. The industrial perspective simplifies the picture. Simplicity is the main borderline, i.e., the scope of a priori knowledge required to utilize a selected approach. Methods that do not require specific knowledge can be evaluated simply, delivering a clear message. The literature offers a basic classification that might be applied to the MPC rule and to quality assessment as well: authors distinguish between data-driven and model-based approaches. There are fundamental traps in the popular interpretation of these notions. First, each method uses data; without data there is no assessment, so actually all methods are data driven. One might therefore distinguish between model-free and model-based approaches. From that perspective, the majority of techniques are model-based, apart from simple integral or time-based ones. All the statistical approaches are model-based, as the evaluated measures originate from some probability density function, which is in fact an assumed statistical model. Thereby, the notion of a model has to be specified; the common understanding is that it is a process model. Consequently, the following classification is used throughout the paper:
  • Model-free means that no process model is required.
  • Process model-based approaches require performing the modeling of the controlled plant.
Therefore, model-free methods require only operational plant data, contrary to the process model-based approaches that always need some initial assumptions, as for instance model type or its structure.
Moreover, the preferred methodology must be robust, i.e., it has to be independent of the existing loop characteristics and the statistical properties of the assessed variable. The goal is to measure internal control quality, unaffected by noises, disturbances, or possible plant influences of any origin.
Present control performance assessment research encompasses various domains and applications of control engineering. Different categories of methods have been investigated [9,69,70]. The classification listed below includes short descriptions addressing the simplicity issues discussed above:
  • Methods requiring plant experiment:
    • measures that use setpoint step response, such as overshoot, undershoot, rise, peak and settling time, decay ratio, offset (steady state error), and peak value [71]; and
    • indexes that require disturbance step response, such as Idle Index [72], Area Index, Output Index [73], and R-index [74].
  • Model-based methods:
    • minimum variance and normalized Harris index [75], Control Performance Index [76], and other variance benchmarking methods [77];
    • all types of model-based measures [78] derived from closed-loop identification, such as aggressive/oscillatory and sluggishness indexes [79];
    • frequency methods starting from classical Bode, Nyquist and Nichols charts with phase and gain margins [69] followed by deeper investigations, such as with the use of Fourier transform [80], sensitivity function [81], reference to disturbance ratio index [82], and singular spectrum analysis [83]; and
    • alternative indexes using neural networks [84] or support vector machines [85].
  • Data-driven methods:
    • integral time measures, e.g., Mean Square Error (MSE), Integral Absolute Error (IAE) [86], Integral of Time-weighted Absolute Error (ITAE) [87], Integral of Square Time derivative of the Control input (ISTC) [88], Total Squared Variation (TSV) [89], and Amplitude Index (AMP) [71];
    • correlation measures, such as oscillation detection index [90] or relative damping index [91];
    • statistical factors utilizing different probabilistic distribution function (standard deviation, variance, skewness, kurtosis, scale, shape, etc.) [92], variance band index [93], or the factors of other probabilistic distributions [94,95,96];
    • benchmarking methods [97]; and
    • alternative indexes using wavelets [98], orthogonal Laguerre [99] and other functions [65], Hurst exponent [100], persistence measures [101,102], entropy [103,104,105], multifractal approaches [106], or fractional-order [107,108].
Apart from the above items, there is a group of methods utilizing hybrid or mixed approaches:
  • fusion CPA measures using sensor combination [109] or the Exponentially Weighted Moving Averages (EWMA) evaluated for other indexes [110];
  • graphic visualization and pattern recognition methods [91,111,112]; and
  • case-specific business Key Performance Indicators (KPIs), e.g., number of alarms or human interventions, time in manual mode [113], and many other currency-based units [67].
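Several of the integral time measures listed above are simple enough to compute directly from logged loop data. A minimal sketch (with illustrative signals and names; the discrete-sum approximations of the integrals and the squared-increment form of TSV are assumptions of this example) could look as follows:

```python
import numpy as np

def integral_indices(e, u, Ts=1.0):
    """A few model-free integral indices for a sampled control error e(k)
    and manipulated variable u(k); Ts is the sampling period."""
    k = np.arange(len(e))
    mse = np.mean(e**2)                    # Mean Square Error
    iae = np.sum(np.abs(e)) * Ts           # Integral Absolute Error
    itae = np.sum(k * Ts * np.abs(e)) * Ts # time-weighted absolute error
    tsv = np.sum(np.diff(u)**2)            # total squared variation of MV
    return {"MSE": mse, "IAE": iae, "ITAE": itae, "TSV": tsv}

# Hypothetical decaying error of a settling loop and its MV trajectory
e = 0.5 * 0.8**np.arange(50)
u = 1.0 - 0.8**np.arange(50)
idx = integral_indices(e, u)
```

Such indices carry no model assumptions, which is exactly why they sit on the model-free side of the classification above.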

4. MPC Performance Assessment

APC incorporates different approaches, mostly multivariate and nonlinear, e.g., predictive control, adaptive structures, or soft computing approaches, i.e., fuzzy logic, artificial neural networks, evolutionary computation, etc. Once the process industry is taken into consideration, predictive control, generally referred to as Model Predictive Control, constitutes the majority. It should be added that nowadays the MPC approach is becoming en vogue, i.e., everybody wants to use it.
Many different structures of predictive control, in both analytical and nonlinear-optimization configurations, have attracted research interest. Actually, MPC performance assessment seems to be a simple and straightforward task. The control rule includes an internal performance index, so its value might naturally serve to measure controller quality. Such an assessment might use any MPC internal variables, such as model information, predictions, or performance index values (see Figure 5). The advantages are clear. However, such an approach requires access to the cost-function value, which is an internal variable and is not accessible in commercial applications. Vendors of APC solutions rarely allow insight into the controller's internal structure, perceiving it as intellectual property. Additionally, the interpretation of such a tailored cost-function requires specific and advanced knowledge. Consequently, the assessment should use signals external to the MPC, such as the Process Variable (PV), controller output, or control error signal (see Figure 6).
There are a few surveys of the Model Predictive Control CPA methods available in the literature. Model-based and model-free approaches are presented below. The section concludes with some industrial applications references.

4.1. Model-Based Approaches

Two approaches of the model-based assessment can be distinguished. The first one uses the so-called external benchmarking approach (Figure 6). Loop signals, such as controlled or manipulated variables, are used to evaluate benchmarking model. The second approach uses internal MPC signals and models (Figure 5).
The minimum variance index measures the distance of the current performance from the best available (optimal in the minimum variance sense) control. The method uses normal process operation data to model the process and to calculate the minimum variance benchmark. Thereby, a simple process model structure and the delay must be known a priori or estimated. The method calculates the coefficients of the impulse response of the noise-to-output transfer function with regressive models, for instance of ARMA type, yielding
$$\eta_0 = \frac{\sigma_y^2}{\sigma_{MV}^2},$$
where $\sigma_y^2$ denotes the CV variance, $\sigma_{MV}^2$ is the minimum achievable variance, and $\eta_0 \geq 1$.
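The procedure can be illustrated with a minimal sketch, assuming routine closed-loop CV data, a known delay d, and a least-squares AR approximation of the disturbance channel; all names are illustrative and not taken from the cited works.

```python
import numpy as np

# The first d impulse-response coefficients of the noise-to-output channel
# are feedback-invariant, so they determine the minimum achievable variance.

def harris_index(y, d, na=20):
    y = np.asarray(y, float) - np.mean(y)
    # least-squares AR(na) fit: y(k) = a1*y(k-1) + ... + ana*y(k-na) + e(k)
    X = np.column_stack([y[na - i - 1:len(y) - i - 1] for i in range(na)])
    Y = y[na:]
    a, *_ = np.linalg.lstsq(X, Y, rcond=None)
    e = Y - X @ a
    sigma_e2 = np.var(e)
    # impulse response h of 1/A(q): h0 = 1, hj = sum_i a_i * h_{j-i}
    h = np.zeros(d)
    h[0] = 1.0
    for j in range(1, d):
        h[j] = sum(a[i] * h[j - 1 - i] for i in range(min(j, na)))
    sigma_mv2 = sigma_e2 * np.sum(h**2)  # minimum achievable variance
    return np.var(y) / sigma_mv2         # eta0 >= 1; 1 means MV-optimal

rng = np.random.default_rng(0)
e = rng.standard_normal(5000)
y = np.convolve(e, [1.0, 0.8, 0.5], mode="full")[:5000]  # MA(2) disturbance
eta0 = harris_index(y, d=3)
```

For this toy disturbance, a delay of d = 3 covers the whole MA(2) dynamics, so the estimated index is close to its ideal value of 1; a shorter delay leaves removable variance and yields a larger index.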
DMC performance assessment using a Harris-type index is addressed in [114]. Further works follow a similar path with other benchmarking approaches [115,116]. Zhao et al. [117,118] proposed LQG benchmarking to estimate the achievable variability reduction through control system improvement. Ko and Edgar proposed using dynamic DMC performance bounds [119,120] for a constrained Model Predictive Control system. They developed an index based on the constrained minimum variance controller. Such a performance bound has been calculated using the proposed moving horizon approach; it converges to the unconstrained minimum variance performance bound as the constraints on process variables become inactive. This method requires the process model, which is utilized to evaluate the constrained minimum variance controller. A stable inverse of the process model is an additional assumption of the methodology.
Other consecutive methods have been proposed for the MPC benchmarking:
  • the design-case approach [121], which uses the MPC controller criterion as the measured performance index $J_{MPC}$;
  • constraint benchmarking taking into account an economic performance assessment [122];
  • Harris-based benchmarking [123] applied to the multivariate cases;
  • multi-parametric quadratic programming analysis has been used to develop maps of minimum variance performance for constrained control over the state-space partition [124];
  • predictive DMC structures compared and assessed when implemented as a single controller or as a supervisory level over PID regulatory control [125];
  • orthogonal projection of the current output onto the space spanned by past outputs, inputs, or setpoints using normal routine closed-loop data [126];
  • the infinite-horizon MPC [65];
  • Filtering and Correlation Analysis algorithm (FCOR) approach used to evaluate the minimum variance control problem and the performance assessment index [127]; and
  • many others [122,128,129].
On-line Model Predictive Control performance benchmarking and monitoring is proposed in [130,131]. Actually, various MPC structures may be used, but the research mostly focuses on the DMC and GPC algorithms, as they are the most popular in practical implementations. The obtained performance index has a formulation similar to the other minimum variance benchmarks:
$$\eta_{MPC} = \frac{J_a}{J_{oMPC}}.$$
The second group of model-based approaches tries to utilize the already existing and evaluated MPC internal signals or knowledge of the model and cost-function. Patwardhan and Shah [132] proposed comparing the expected value of the controller’s internal objective function against its actual value over some assumed horizon. Schafer and Cinar [133] presented a hybrid monitoring and diagnosis algorithm, in which historical, actual, and design model-based performances are compared. Loquasto and Seborg [134] proposed a methodology based on principal component analysis, in which process data are classified into different patterns depending on the source of suboptimal performance; the assessment uses this classification.
Agarwal et al. [135,136] considered constraints and their connection with the performance. The method uses a probabilistic approach based on constraint analysis (probabilistic performance analysis) or the Bayesian inference framework to derive tuning guidelines. Other methods take into consideration the prediction error as the primary variable for performance monitoring of the MPC system. Kesavan and Lee [137,138] derived two diagnostic tests using the prediction error to detect quality degradation and to diagnose the respective causes. Harrison and Qin [139] evaluated a method to discriminate between suboptimal performance caused by plant-model mismatch and by incorrect Kalman filter tuning. A method to detect the precise location of plant-model mismatch was proposed by Badwe et al. [140]; it uses a statistical approach through correlation analysis of an optimal and a working controller. A similar approach with residual model analysis has been continued in [141], while Chen proposed a statistical approach to detect model mismatch [142]. Pannocchia et al. [143] proposed an approach based on analysis of the prediction error, focusing on identifying the presence of plant/model mismatch or incorrect disturbance modeling/augmented state estimation in offset-free MPC formulations. Subsequent approaches try to incorporate the prediction error over a given horizon, e.g., Zhao et al. [144] suggested monitoring a multi-step prediction error.
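The multi-step prediction error idea can be sketched in a few lines of Python. The first-order ARX model below is purely illustrative (in practice the embedded MPC model would be used); a profile of RMS errors growing with the horizon k hints at plant-model mismatch, whereas unmeasured noise tends to affect all horizons alike.

```python
import numpy as np

def kstep_rms_errors(y, u, a, b, horizon):
    """RMS k-step-ahead prediction errors for an illustrative first-order
    ARX model y[t+1] = a*y[t] + b*u[t]. Returns one RMS value per
    horizon k = 1..horizon; a profile growing with k points to
    plant-model mismatch rather than unmeasured disturbances."""
    y = np.asarray(y, dtype=float)
    u = np.asarray(u, dtype=float)
    N = len(y)
    rms = []
    for k in range(1, horizon + 1):
        errs = []
        for t in range(N - k):
            yhat = y[t]
            for j in range(k):          # iterate the model k steps ahead
                yhat = a * yhat + b * u[t + j]
            errs.append(y[t + k] - yhat)
        rms.append(float(np.sqrt(np.mean(np.square(errs)))))
    return rms
```

When the data are generated by the model itself, all k-step errors vanish; a mismatched `a` or `b` makes the longer-horizon errors grow fastest.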
A detailed analysis of the MPC embedded model is suggested in [145,146]. The authors used the nominal sensitivity function to provide a complete diagnosis of the model, highlighting not only the effect of the model uncertainties on the corresponding system outputs, but also how a single output impacts the other variables. The use of sensitivity analysis followed previous works on different aspects of economic and non-dynamic controller performance [147].
Recently, the whole LP-DMC problem has been taken into consideration by defining the underlying off-line optimization problem [148]. Its solution has been used as a benchmark for the study of global closed-loop behavior. Finally, three global indicators for the evaluation and diagnosis of poor controller performance have been proposed.

4.2. Data-Driven Approaches

The data-driven approach is less frequently observed in the literature. The explanation is quite simple: as no model is required, the assessment does not depend on the control algorithm of the loop, so any existing model-free approach might be used. The interest shifts towards the interpretation of the results. Consequently, any approach using step responses (overshoot and settling time), integral indexes (MSE and IAE), signal correlation, statistical measures (standard deviation, histogram broadness, distortion coefficients, and tail index), information entropies, persistence, and fractal measures may equally be used.
The first simple assessment approaches utilized comparison of time trends [149], which is in fact the first step of any practical assessment. A statistical method was developed by Zhang and Li [150], further followed by AlGhazzawi and Lennox [151], who focused on the derivation of simple and intuitive charts to support plant operators.
Chen et al. [152] proposed applying the sensitivity function and the integral squared error as performance evaluation criteria in the frequency and time domains, respectively, to quantitatively analyze a single prediction strategy.
Similar to the model-driven approaches, predictive structures have been assessed in different application configurations, i.e., as the main regulatory controller or at the supervisory level over PID control loops [153].
An interesting approach using a novel multivariate statistical technique called Slow Feature Analysis (SFA) has been proposed to separate temporally slow features from process variables. It was first used for diagnostics and then extended to the MPC assessment task [154], later followed by further modifications [155,156]. The approach enables monitoring of both steady-state and dynamic responses.
Non-Gaussian statistical [157] and fractal [158] methodologies have been investigated for the GPC predictive control algorithm. Linear [159] and nonlinear [160] DMC predictive control have been assessed using integral, statistical, information, and fractal measures. This research has shown that the dispersion coefficients of the non-Gaussian α-stable distribution are robust against industrial disturbances; they allow measuring control quality and detecting incorrect MPC design. Similar effects can be obtained with the robust statistics approach. Robust scale estimators, such as the Mean Absolute Deviation (MAD), the Mean Absolute Deviation Around Median (MADAM), the Least Median Square (LMS), or the m-estimator using the logistic ψ function, deliver further alternatives. In addition, information theory brings forward the possibility to use entropy measures [160].
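For instance, two of the robust scale estimators named above can be computed directly. This is a minimal sketch; MADAM follows the expansion used in this survey, i.e., the mean absolute deviation around the median.

```python
import numpy as np

def mad(x):
    """Mean Absolute Deviation: average distance from the mean;
    less outlier-sensitive than the standard deviation."""
    x = np.asarray(x, dtype=float)
    return float(np.mean(np.abs(x - x.mean())))

def madam(x):
    """Mean Absolute Deviation Around the Median: centring on the
    median makes the estimate robust to asymmetric outliers."""
    x = np.asarray(x, dtype=float)
    return float(np.mean(np.abs(x - np.median(x))))
```

A single large control-error spike inflates the standard deviation far more than either of these estimates, which is what makes them attractive for industrial data.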
Furthermore, fractional-order dynamics extend the possible set of robust and non-Gaussian indicators [107,108]. Xu et al. [161,162] proposed to evaluate MPC performance and capture the fluctuation of the process variables with a performance index based on the Mahalanobis distance. This distance is used to construct a support vector machine classifier that allows recognizing common quality degradation schemes and determining the root cause of poor performance.
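A distance-based index of this kind can be sketched as follows. The interface is an assumption: the reference mean and covariance are taken from a benchmark period of good performance, and the SVM classifier stage described in [161,162] is omitted.

```python
import numpy as np

def mahalanobis_index(window, ref_mean, ref_cov):
    """Mean Mahalanobis distance of process-variable samples in a sliding
    window (shape: n_samples x n_vars) from a reference distribution
    estimated over a benchmark period. Larger values flag fluctuation
    patterns departing from the benchmark behavior."""
    inv_cov = np.linalg.inv(ref_cov)
    diffs = np.asarray(window, dtype=float) - ref_mean
    # per-row quadratic form d_i^2 = (x_i - m)^T C^-1 (x_i - m)
    d2 = np.einsum('ij,jk,ik->i', diffs, inv_cov, diffs)
    return float(np.mean(np.sqrt(d2)))
```

The resulting scalar can be trended over consecutive windows, or fed as a feature into a classifier that maps it to known degradation schemes.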

4.3. Industrial Implementations

MPC, as an important component of advanced process control, has been used in practice for more than 40 years. Any report on a successful Model Predictive Control application uses some performance measures, most often the mean square error of the controlled variables. These applications mostly address the chemical and power generation industries: fluid catalytic cracking units, distillation columns, polyvinyl chloride plants, fed-batch bioreactors, power generation efficiency, NOx emission control, and many others [26,27,64]. Specific examples of MPC-oriented CPA can be found, among others, in:
  • industrial validation of the multivariate MPC performance assessment at para-xylene production and poly-propylene splitter column processes in [163];
  • kerosene and naphtha hydrotreating units in [164]; and
  • model assessment performed on an industrial predictive controller applied to a propylene/propane separation system [165,166], using the methodology proposed by Botelho et al. [145,146].
Generally, MPC performance assessment is used in industry during two phases of the controller life-cycle. First, it is required during decision making, to determine whether the application of MPC is technologically and financially feasible. The second phase is the need to verify the results, or to confirm that the initially obtained profits are sustained. Such activities are often called the control feasibility or performance study. Perpetual on-line validation of the results may be very useful. There are several commercial CPA software packages available on the market supporting on-line assessment [9,65,113].
A summary of the implementation experience collected during industrial APC and MPC applications may be found in [63,65,67]. It is interesting to note that the industrial reality of Model Predictive Control applications is not so clear-cut; several issues limit its applicability. Sustainability of the obtained benefits and the lack of experienced personnel constitute the two main factors limiting further industrial MPC dissemination.

5. MPC Assessment Procedure

Automagic evaluation of any index without profound reflection on its properties in a given control environment leads nowhere [167]. A CPA procedure should take into account all available degrees of freedom of the MPC application and should use all available case-specific knowledge [9]. No single tool is a universal problem solver. The following control system assessment procedure has been evaluated throughout dozens of industrial projects and may serve as an initial reference plan:
(1) Take a plant walk-down and talk to the plant personnel: operators, control and technology engineers.
(2) Review time trends of the relevant variables using the plant control system.
(3) Investigate the AUTO/MAN mode of operation for the considered controllers.
(4) Collect historical data for the assessed control loops.
(5) Calculate basic and simple data statistics, such as minimum, maximum, mean, median, standard deviation, skewness, kurtosis, MAD, etc.
(6) If the step response is available or can be calculated, estimate the settling time and the overshoot.
(7) Prepare static curves (MV-CV plots) to assess nonlinearities and noise ratios.
(8) Calculate control error integral indexes: MSE and IAE, though MSE should be used with caution.
(9) Check the stationarity of the process variables, search for possible trends, and try to remove them.
(10) Identify potential oscillations, assess their frequency, and try to remove them.
(11) Draw the control error histogram, check its shape, validate normality tests, and look for possible fat tails.
(12) Fit underlying distributions, select the best fitting function, and estimate its coefficients with the aim to identify the underlying generation mechanism.
  (a) If signals are Gaussian, the normal standard deviation and other moments may be used.
  (b) Once fat tails exist, the α-stable distribution seems to be a reliable choice with its coefficients: scaling γ, skewness β, or characteristic exponent α.
  (c) Calculate robust scale estimators $\sigma_{rob}$.
  (d) Otherwise, select coefficients of another best fitting PDF.
(13) In case of fat tails, data non-stationarity, or self-similarity, conduct the persistence analysis using the rescaled range R/S and estimate Hurst exponents and crossover points.
(14) Translate the obtained numbers into verbal conclusions.
(15) Suggest relevant improvement actions.
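Several computational steps of the procedure (basic statistics, integral indexes, and persistence analysis) can be sketched in Python. This is a minimal illustration with a deliberately crude rescaled-range (R/S) Hurst estimate, not a production implementation.

```python
import numpy as np

def basic_stats(e):
    """Simple statistics of the control error signal."""
    e = np.asarray(e, dtype=float)
    return {'min': float(e.min()), 'max': float(e.max()),
            'mean': float(e.mean()), 'median': float(np.median(e)),
            'std': float(e.std())}

def integral_indexes(e, dt=1.0):
    """MSE and IAE integral indexes of the control error."""
    e = np.asarray(e, dtype=float)
    return {'MSE': float(np.mean(e**2)), 'IAE': float(np.sum(np.abs(e)) * dt)}

def hurst_rs(e, window=64):
    """Crude rescaled-range estimate of the Hurst exponent:
    H ~ 0.5 for uncorrelated noise, H > 0.5 for persistent signals."""
    e = np.asarray(e, dtype=float)
    log_s, log_rs = [], []
    for s in (size for size in (8, 16, 32, window) if size <= len(e)):
        rs_vals = []
        for start in range(0, len(e) - s + 1, s):
            seg = e[start:start + s]
            z = np.cumsum(seg - seg.mean())   # cumulative deviation profile
            sd = seg.std()
            if sd > 0:
                rs_vals.append((z.max() - z.min()) / sd)
        if rs_vals:
            log_s.append(np.log(s))
            log_rs.append(np.log(np.mean(rs_vals)))
    return float(np.polyfit(log_s, log_rs, 1)[0])  # slope = Hurst exponent
```

The verbal translation of such numbers (step 14) remains an engineering task; the code only supplies the raw indicators.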
The procedure presented above uses model-free measures and does not require any modeling or questionable assumptions about the considered process. One has to remember that unique plant features demand engineering flexibility and creativity. Industrial MPC performance assessment is an art. The CPA process cannot be fully dehumanized; however, any supporting decision-making software is helpful, if available.

6. Discussion and Further Research

The paper presents a survey of control performance assessment methodologies that can be used to evaluate the quality of Model Predictive Control. The paper consists of two main parts. A short summary of MPC technologies is followed by the main part: the review of the assessment techniques. First, a general summary of the research on CPA is presented. This description introduces the reader to the main section, i.e., the survey of approaches applied to measure MPC quality.
The methods are divided into two groups: model-based and model-free. Approaches that require modeling always need some a priori knowledge. Therefore, the question arises whether it is the model or the performance that is wrong. Model-based methods inherit the limitations of the underlying modeling methodology. Non-stationarity, non-Gaussian system properties, nonlinearity, and correlated disturbances do not help; worse, they bias the estimates and obscure the real performance. On the contrary, model-free methods are more universal; however, they do not provide knowledge of how far the assessed system is from the best achievable performance. The assessment becomes a game of compromises.
The paper concludes with the proposed assessment procedure. It is an open suggestion. Each application example exhibits its own specific properties that must always be considered. The assessment engineer must keep an open mind and cannot be tied to any single habit or method. Control performance assessment, CPA-MPC in particular, requires rationality, awareness, open eyes, and independence of opinion.

Funding

This research received no external funding.

Conflicts of Interest

The author declares no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
CPA	Control Performance Assessment
PDF	Probability Density Function
MPC	Model Predictive Control
MIMO	Multi Input Multi Output
SISO	Single Input Single Output
PID	Proportional, Integral and Derivative
LQR	Linear Quadratic Regulator
DMC	Dynamic Matrix Control
LP-DMC	Linear Programming Dynamic Matrix Control
QDMC	Quadratic Dynamic Matrix Control
GPC	Generalized Predictive Control
MAC	Model Algorithmic Control
MV	Manipulated Variable
CV	Controlled Variable
DV	Disturbance Variable
PV	Process Variable
ARMA	Auto-Regressive Moving Average
ARIMAX	Auto-Regressive Integrated Moving Average with auXiliary Input
CARIMA	Controlled Auto-Regressive Integrated Moving Average
NO-MPC	Nonlinear Optimization Model Predictive Control
MPC-NPLPT	MPC with Nonlinear Prediction and Linearization Along the Predicted Trajectory
MSE	Mean Square Error
IAE	Integral Absolute Error
ITAE	Integral of Time-weighted Absolute Error
ISTC	Integral of Square Time derivative of the Control input
TSV	Total Squared Variation
AMP	Amplitude Index
LQG	Linear Quadratic Gaussian
PCA	Principal Component Analysis
FCOR	Filtering and CORrelation analysis
KPI	Key Performance Indicator
EWMA	Exponentially Weighted Moving Average
SVM	Support Vector Machine
MAD	Mean Absolute Deviation
MADAM	Mean Absolute Deviation Around Median
SFA	Slow Feature Analysis

References

  1. Knospe, C. PID control. IEEE Control Syst. Mag. 2006, 26, 30–31.
  2. Åström, K.J.; Murray, R. Feedback Systems: An Introduction for Scientists and Engineers; Princeton University Press: Princeton, NJ, USA; Oxford, UK, 2012; Available online: http://www.cds.caltech.edu/~murray/amwiki (accessed on 25 October 2019).
  3. Samad, T. A Survey on Industry Impact and Challenges Thereof [Technical Activities]. IEEE Control Syst. Mag. 2017, 37, 17–18.
  4. Soltesz, K. On Automation of the PID Tuning Procedure. Ph.D. Thesis, Department of Automatic Control, Lund University, Lund, Sweden, 2012.
  5. Leiviskä, K. Industrial Applications of Soft Computing: Paper, Mineral and Metal Processing Industries; Springer: Berlin/Heidelberg, Germany, 2001.
  6. Betlem, B.; Roffel, B. Advanced Practical Process Control; Advances in Soft Computing; Springer: Berlin/Heidelberg, Germany, 2003.
  7. Ordys, A.; Uduehi, D.; Johnson, M.A. Process Control Performance Assessment—From Theory to Implementation; Springer: London, UK, 2007.
  8. Ribeiro, R.N.; Muniz, E.S.; Metha, R.; Park, S.W. Economic evaluation of advanced process control projects. Rev. O Pap. 2013, 74, 57–65.
  9. Domański, P.D. Control Performance Assessment: Theoretical Analyses and Industrial Practice; Springer International Publishing: Cham, Switzerland, 2020.
  10. Tatjewski, P. Disturbance modeling and state estimation for offset-free predictive control with state-space process models. Int. J. Appl. Math. Comput. Sci. 2014, 24, 313–323.
  11. Tatjewski, P. Advanced Control of Industrial Processes, Structures and Algorithms; Springer: London, UK, 2007.
  12. Tatjewski, P. Supervisory predictive control and on-line set-point optimization. Int. J. Appl. Math. Comput. Sci. 2010, 20, 483–495.
  13. Yang, X.; Maciejowski, J.M. Fault tolerant control using Gaussian processes and model predictive control. Int. J. Appl. Math. Comput. Sci. 2015, 25, 133–148.
  14. Domański, P.D. Optimization Projects in Industry—Much ado about nothing. In Proceedings of the VIII National Conference on Evolutionary Algorithms and Global Optimization KAEiOG, Korbielów, Poland; Warsaw University of Technology Press: Warsaw, Poland, 2005; pp. 45–54.
  15. Maciejowski, J.M. Predictive Control with Constraints; Prentice Hall: Harlow, UK, 2002.
  16. Kalman, R.E. Contribution to the theory of optimal control. Bol. Soc. Math. Mex. 1960, 5, 102–119.
  17. Richalet, J.; Rault, A.; Testud, J.; Papon, J. Model algorithmic control of industrial processes. IFAC Proc. Vol. 1977, 10, 103–120.
  18. Cutler, C.R.; Ramaker, B. Dynamic matrix control—A computer control algorithm. In Proceedings of the AIChE National Meeting, Houston, TX, USA, 1–5 April 1979.
  19. Clarke, W.; Mohtadi, C.; Tuffs, P.S. Generalized predictive control—I. The basic algorithm. Automatica 1987, 23, 137–148.
  20. Clarke, W.; Mohtadi, C.; Tuffs, P.S. Generalized predictive control—II. Extensions and interpretations. Automatica 1987, 23, 149–160.
  21. Camacho, E.F.; Bordons, C. Model Predictive Control; Springer: London, UK, 1999.
  22. Lee, J.H. Model predictive control: Review of the three decades of development. Int. J. Control. Autom. Syst. 2011, 9, 415.
  23. Forbes, M.G.; Patwardhan, R.S.; Hamadah, H.; Gopaluni, R.B. Model Predictive Control in Industry: Challenges and Opportunities. IFAC-PapersOnLine 2015, 48, 531–538.
  24. Zanoli, S.M.; Cocchioni, F.; Pepe, C. MPC-based energy efficiency improvement in a pusher type billets reheating furnace. Adv. Sci. Technol. Eng. Syst. J. 2018, 3, 74–84.
  25. Stentoft, P.; Munk-Nielsen, T.; Møller, J.; Madsen, H.; Vezzaro, L.; Mikkelsen, P.; Vangsgaard, A. Green MPC—An approach towards predictive control for minimal environmental impact of activated sludge processes. In Proceedings of the 10th IWA Symposium on Modelling and Integrated Assessment, Copenhagen, Denmark, 1–4 September 2019.
  26. Cychowski, M. Explicit Nonlinear Model Predictive Control, Theory and Applications; VDM Verlag Dr. Mueller: Berlin/Heidelberg, Germany, 2009.
  27. Khaled, N.; Pattel, B. Practical Design and Application of Model Predictive Control; Butterworth-Heinemann: Cambridge, MA, USA, 2018.
  28. Guo, J.; Luo, Y.; Li, K. Adaptive neural-network sliding mode cascade architecture of longitudinal tracking control for unmanned vehicles. Nonlinear Dyn. 2017, 87, 2497–2510.
  29. Sawulski, J.; Ławryńczuk, M. Optimization of control strategy for a low fuel consumption vehicle engine. Inf. Sci. 2019, 493, 192–216.
  30. Sardarmehni, T.; Rahmani, R.; Menhaj, M.B. Robust control of wheel slip in anti-lock brake system of automobiles. Nonlinear Dyn. 2014, 76, 125–138.
  31. Takács, G.; Batista, G.; Gulan, M.; Rohal’-Ilkiv, B. Embedded explicit model predictive vibration control. Mechatronics 2016, 36, 54–62.
  32. Xu, F.; Chen, H.; Gong, X. Fast nonlinear Model Predictive Control on FPGA using particle swarm optimization. IEEE Trans. Ind. Electron. 2016, 63, 310–321.
  33. Zhu, B. Nonlinear adaptive neural network control for a model-scaled unmanned helicopter. Nonlinear Dyn. 2014, 78, 1695–1708.
  34. Rousseeuw, P.J.; Leroy, A.M. Robust Regression and Outlier Detection; John Wiley & Sons, Inc.: New York, NY, USA, 1987.
  35. Dötlinger, A.; Kennel, R.M. Near time-optimal model predictive control using an L1-norm based cost functional. In Proceedings of the 2014 IEEE Energy Conversion Congress and Exposition (ECCE), Pittsburgh, PA, USA, 14–18 September 2014; pp. 3504–3511.
  36. Bemporad, A.; Borrelli, F.; Morari, M. Model predictive control based on linear programming—The explicit solution. IEEE Trans. Autom. Control 2002, 47, 1974–1985.
  37. Gallieri, M. Lasso-MPC—Predictive Control with ℓ1-Regularised Least Squares; Springer Theses; Springer International Publishing: Cham, Switzerland, 2016.
  38. Arabas, J.; Białobrzeski, L.; Chomiak, T.; Domański, P.D.; Świrski, K.; Neelakantan, R. Pulverized Coal Fired Boiler Optimization and NOx Control using Neural Networks and Fuzzy Logic. In Proceedings of the AspenWorld’97, Boston, MA, USA, 12–16 October 1997.
  39. Ławryńczuk, M. Practical nonlinear predictive control algorithms for neural Wiener models. J. Process Control 2013, 23, 696–714.
  40. Ławryńczuk, M. Computationally Efficient Model Predictive Control Algorithms: A Neural Network Approach; Studies in Systems, Decision and Control; Springer: Cham, Switzerland, 2014; Volume 3.
  41. Ławryńczuk, M. Nonlinear State–Space Predictive Control with On–Line Linearisation and State Estimation. Int. J. Appl. Math. Comput. Sci. 2015, 25, 833–847.
  42. Lahiri, S.K. Multivariable Predictive Control: Applications in Industry; John Wiley & Sons, Ltd.: Hoboken, NJ, USA, 2017.
  43. Cutler, C.R.; Ramaker, B.L. Dynamic matrix control—A computer control algorithm. Jt. Autom. Control Conf. 1980, 17, 72.
  44. Prett, D.M.; Gillette, R.D. Optimization and constrained multivariable control of a catalytic cracking unit. Jt. Autom. Control Conf. 1980, 17, 73.
  45. Cutler, C.R.; Haydel, J.J.; Moshedi, A.M. An Industrial Perspective on Advanced Control. In Proceedings of the AIChE Diamond Jubilee Meeting, Washington, DC, USA, 4 November 1983.
  46. Plamowski, S. Implementation of DMC algorithm in embedded controller—Resources, memory and numerical modifications. In Trends in Advanced Intelligent Control, Optimization and Automation; Mitkowski, W., Kacprzyk, J., Oprzędkiewicz, K., Skruch, P., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 335–343.
  47. Xu, J.; Pan, X.; Li, Y.; Wang, G.; Martinez, R. An improved generalized predictive control algorithm based on the difference equation CARIMA model for the SISO system with known strong interference. J. Differ. Equ. Appl. 2019, 25, 1255–1269.
  48. Solís-Chaves, J.S.; Rodrigues, L.L.; Rocha-Osorio, C.M.; Sguarezi Filho, A.J. A Long-Range Generalized Predictive Control Algorithm for a DFIG Based Wind Energy System. IEEE/CAA J. Autom. Sin. 2019, 6, 1209.
  49. Rodríguez, M.; Pérez, D. First principles model based control. In European Symposium on Computer-Aided Process Engineering-15, 38th European Symposium of the Working Party on Computer Aided Process Engineering; Puigjaner, L., Espuña, A., Eds.; Computer Aided Chemical Engineering; Elsevier: Amsterdam, The Netherlands, 2005; Volume 20, pp. 1285–1290.
  50. Zhang, Z.; Wu, Z.; Rincon, D.; Christofides, P.D. Real-Time Optimization and Control of Nonlinear Processes Using Machine Learning. Mathematics 2019, 7, 890.
  51. Gabor, J.; Pakulski, D.; Domański, P.D.; Świrski, K. Closed loop NOx control and optimization using neural networks. In Proceedings of the IFAC Symposium on Power Plants and Power Systems Control, Brussels, Belgium, 26–29 April 2000; pp. 188–196.
  52. Afram, A.; Janabi-Sharifi, F.; Fung, A.S.; Raahemifar, K. Artificial neural network (ANN) based model predictive control (MPC) and optimization of HVAC systems: A state of the art review and case study of a residential HVAC system. Energy Build. 2017, 141, 96–113.
  53. Patwardhan, R.S.; Lakshminarayanan, S.; Shah, S.L. Constrained nonlinear MPC using Hammerstein and Wiener models: PLS framework. AIChE J. 1998, 44, 1611–1622.
  54. Espinosa, J.J.; Vandewalle, J. Predictive Control Using Fuzzy Models. In Advances in Soft Computing; Roy, R., Furuhashi, T., Chawdhry, P.K., Eds.; Springer: London, UK, 1999; pp. 187–200.
  55. Huang, Y.; Lou, H.H.; Gong, J.P.; Edgar, T.F. Fuzzy model predictive control. IEEE Trans. Fuzzy Syst. 2000, 8, 665–678.
  56. Schultz, P.; Golenia, Z.; Grott, J.; Domański, P.D.; Świrski, K. Advanced Emission Control. In Proceedings of the Power GEN Europe 2000 Conference, Helsinki, Finland, 20–22 June 2000; pp. 171–178.
  57. Subathra, B.; Seshadhri, S.; Radhakrishnan, T. A comparative study of neuro fuzzy and recurrent neuro fuzzy model-based controllers for real-time industrial processes. Syst. Sci. Control Eng. 2015, 3, 412–426.
  58. Jain, A.; Nghiem, T.; Morari, M.; Mangharam, R. Learning and Control Using Gaussian Processes. In Proceedings of the 2018 ACM/IEEE 9th International Conference on Cyber-Physical Systems (ICCPS), Porto, Portugal, 11–13 April 2018; pp. 140–149.
  59. Hewing, L.; Kabzan, J.; Zeilinger, M.N. Cautious Model Predictive Control Using Gaussian Process Regression. IEEE Trans. Control Syst. Technol. 2019, 1–8.
  60. Jain, A.; Smarra, F.; Behl, M.; Mangharam, R. Data-Driven Model Predictive Control with Regression Trees—An Application to Building Energy Management. ACM Trans. Cyber-Phys. Syst. 2018, 2, 1–21.
  61. Smarra, F.; Di Girolamo, G.D.; De Iuliis, V.; Jain, A.; Mangharam, R.; D’Innocenzo, A. Data-driven switching modeling for MPC using Regression Trees and Random Forests. Nonlinear Anal. Hybrid Syst. 2020, 36, 100882.
  62. Ernst, D.; Glavic, M.; Capitanescu, F.; Wehenkel, L. Reinforcement Learning Versus Model Predictive Control: A Comparison on a Power System Problem. IEEE Trans. Syst. Man Cybern. Part B (Cybern.) 2009, 39, 517–529.
  63. Smuts, J.F.; Hussey, A. Requirements for Successfully Implementing and Sustaining Advanced Control Applications. In Proceedings of the 54th ISA POWID Symposium, Charlotte, NC, USA, 6–8 June 2011; pp. 89–105.
  64. Domański, P.D.; Leppakoski, J. Advanced Process Control Implementation of Boiler Optimization driven by Future Power Market Challenges. In Proceedings of the Pennwell Conference Coal-GEN Europe 2012, Warsaw, Poland, 14–16 February 2012.
  65. Jelali, M. Control Performance Management in Industrial Automation: Assessment, Diagnosis and Improvement of Control Loop Performance; Springer: London, UK, 2013.
  66. Starr, K.D.; Petersen, H.; Bauer, M. Control loop performance monitoring—ABB’s experience over two decades. IFAC-PapersOnLine 2016, 49, 526–532.
  67. Bauer, M.; Horch, A.; Xie, L.; Jelali, M.; Thornhill, N. The current state of control loop performance monitoring—A survey of application in industry. J. Process Control 2016, 38, 1–10.
  68. Åström, K.J. Computer control of a paper machine—An application of linear stochastic control theory. IBM J. 1967, 11, 389–405.
  69. Shardt, Y.; Zhao, Y.; Qi, F.; Lee, K.; Yu, X.; Huang, B.; Shah, S. Determining the state of a process control system: Current trends and future challenges. Can. J. Chem. Eng. 2012, 90, 217–245.
  70. O’Neill, Z.; Li, Y.; Williams, K. HVAC control loop performance assessment: A critical review (1587-RP). Sci. Technol. Built Environ. 2017, 23, 619–636.
  71. Spinner, T.; Srinivasan, B.; Rengaswamy, R. Data-based automated diagnosis and iterative retuning of proportional-integral (PI) controllers. Control Eng. Pract. 2014, 29, 23–41.
  72. Hägglund, T. Automatic detection of sluggish control loops. Control Eng. Pract. 1999, 7, 1505–1511.
  73. Visioli, A. Method for Proportional-Integral Controller Tuning Assessment. Ind. Eng. Chem. Res. 2006, 45, 2741–2747.
  74. Salsbury, T.I. A practical method for assessing the performance of control loops subject to random load changes. J. Process Control 2005, 15, 393–405.
  75. Harris, T.J. Assessment of closed loop performance. Can. J. Chem. Eng. 1989, 67, 856–861.
  76. Grimble, M.J. Controller performance benchmarking and tuning using generalised minimum variance control. Automatica 2002, 38, 2111–2119.
  77. Harris, T.J.; Seppala, C.T. Recent Developments in Controller Performance Monitoring and Assessment Techniques. In Proceedings of the Sixth International Conference on Chemical Process Control, Tucson, AZ, USA, 7–12 January 2001; pp. 199–207.
  78. Meng, Q.W.; Gu, J.Q.; Zhong, Z.F.; Ch, S.; Niu, Y.G. Control performance assessment and improvement with a new performance index. In Proceedings of the 2013 25th Chinese Control and Decision Conference (CCDC), Guiyang, China, 25–27 May 2013; pp. 4081–4084.
  79. Salsbury, T.I. Continuous-time model identification for closed loop control performance assessment. Control Eng. Pract. 2007, 15, 109–121.
  80. Schlegel, M.; Skarda, R.; Cech, M. Running discrete Fourier transform and its applications in control loop performance assessment. In Proceedings of the 2013 International Conference on Process Control (PC), Strbske Pleso, Slovakia, 18–21 June 2013; pp. 113–118.
  81. Tepljakov, A.; Petlenkov, E.; Belikov, J. A flexible MATLAB tool for optimal fractional-order PID controller design subject to specifications. In Proceedings of the 2012 31st Chinese Control Conference (CCC), Hefei, China, 25–27 July 2012.
  82. Alagoz, B.B.; Tan, N.; Deniz, F.N.; Keles, C. Implicit disturbance rejection performance analysis of closed loop control systems according to communication channel limitations. IET Control Theory Appl. 2015, 9, 2522–2531.
  83. Yuan, H. Process Analysis and Performance Assessment for Sheet Forming Processes. Ph.D. Thesis, Queen’s University, Kingston, ON, Canada, 2015.
  84. Zhou, Y.; Wan, F. A neural network approach to control performance assessment. Int. J. Intell. Comput. Cybern. 2008, 1, 617–633.
  85. Pillay, N.; Govender, P. Multi-Class SVMs for Automatic Performance Classification of Closed Loop Controllers. J. Control Eng. Appl. Inform. 2017, 19, 3–12.
  86. Shinskey, F.G. How Good are Our Controllers in Absolute Performance and Robustness? Meas. Control 1990, 23, 114–121.
  87. Zhao, Y.; Xie, W.; Tu, X. Performance-based parameter tuning method of model-driven PID control systems. ISA Trans. 2012, 51, 393–399.
  88. Zheng, B. Analysis and Auto-Tuning of Supply Air Temperature PI Control in Hot Water Heating Systems. Ph.D. Thesis, University of Nebraska, Lincoln, NE, USA, 2007.
  89. Yu, Z.; Wang, J. Performance assessment of static lead-lag feedforward controllers for disturbance rejection in PID control loops. ISA Trans. 2016, 64, 67–76.
  90. Horch, A. A simple method for detection of stiction in control valves. Control Eng. Pract. 1999, 7, 1221–1231.
  91. Howard, R.; Cooper, D. A novel pattern-based approach for diagnostic controller performance monitoring. Control Eng. Pract. 2010, 18, 279–288.
  92. Choudhury, M.A.A.S.; Shah, S.L.; Thornhill, N.F. Diagnosis of poor control-loop performance using higher-order statistics. Automatica 2004, 40, 1719–1728.
  93. Li, Y.; O’Neill, Z. Evaluating control performance on building HVAC controllers. In Proceedings of the BS2015—14th Conference of International Building Performance Simulation Association, Hyderabad, India, 7–9 December 2015; pp. 962–967.
  94. Zhong, L. Defect distribution model validation and effective process control. Proc. SPIE 2003, 5041, 31–38.
  95. Domański, P.D. Non-Gaussian Statistical Measures of Control Performance. Control Cybern. 2017, 46, 259–290.
  96. Domański, P.D.; Golonka, S.; Marusak, P.M.; Moszowski, B. Robust and Asymmetric Assessment of the Benefits from Improved Control—Industrial Validation. IFAC-PapersOnLine 2018, 51, 815–820.
  97. Hadjiiski, M.; Georgiev, Z. Benchmarking of Process Control Performance. In Problems of Engineering, Cybernetics and Robotics; Bulgarian Academy of Sciences: Sofia, Bulgaria, 2005; Volume 55, pp. 103–110.
  98. Nesic, Z.; Dumont, G.; Davies, M.; Brewster, D. CD Control Diagnostics Using a Wavelet Toolbox. In Proceedings of the CD Symposium, IMEKO, Tampere, Finland, 1–6 June 1997; Volume XB, pp. 120–125.
  99. Lynch, C.B.; Dumont, G.A. Control loop performance monitoring. IEEE Trans. Control Syst. Technol. 1996, 4, 185–192.
  100. Pillay, N.; Govender, P. A Data Driven Approach to Performance Assessment of PID Controllers for Setpoint Tracking. Procedia Eng. 2014, 69, 1130–1137.
  101. Domański, P.D. Non-Gaussian and persistence measures for control loop quality assessment. Chaos Interdiscip. J. Nonlinear Sci. 2016, 26, 043105.
  102. Domański, P.D. Control quality assessment using fractal persistence measures. ISA Trans. 2019, 90, 226–234.
  103. Zhang, J.; Jiang, M.; Chen, J. Minimum entropy-based performance assessment of feedback control loops subjected to non-Gaussian disturbances. J. Process Control 2015, 24, 1660–1670.
  104. Zhou, J.; Jia, Y.; Jiang, H.; Fan, S. Non-Gaussian Systems Control Performance Assessment Based on Rational Entropy. Entropy 2018, 20, 331.
  105. Zhang, Q.; Wang, Y.; Lee, F.; Chen, Q.; Sun, Z. Improved Renyi Entropy Benchmark for Performance Assessment of Common Cascade Control System. IEEE Access 2019, 7, 6796–6803. [Google Scholar] [CrossRef]
  106. Domański, P.D.; Gintrowski, M. Alternative approaches to the prediction of electricity prices. Int. J. Energy Sect. Manag. 2017, 11, 3–27. [Google Scholar] [CrossRef]
  107. Liu, K.; Chen, Y.Q.; Domański, P.D.; Zhang, X. A Novel Method for Control Performance Assessment with Fractional Order Signal Processing and Its Application to Semiconductor Manufacturing. Algorithms 2018, 11, 90. [Google Scholar] [CrossRef][Green Version]
  108. Liu, K.; Chen, Y.; Domański, P.D. Control Performance Assessment of the Disturbance with Fractional Order Dynamics. In Nonlinear Dynamics and Control; Lacarbonara, W., Balachandran, B., Ma, J., Tenreiro Machado, J.A., Stepan, G., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 255–264. [Google Scholar]
  109. Khamseh, S.A.; Sedigh, A.K.; Moshiri, B.; Fatehi, A. Control performance assessment based on sensor fusion techniques. Control Eng. Pract. 2016, 49, 14–28. [Google Scholar] [CrossRef]
  110. Salsbury, T.I.; Alcala, C.F. Two new normalized EWMA-based indices for control loop performance assessment. In Proceedings of the 2015 American Control Conference (ACC), Chicago, IL, USA, 1–3 July 2015; pp. 962–967. [Google Scholar]
  111. Dziuba, K.; Góra, R.; Domański, P.D.; Ławryńczuk, M. Multicriteria control quality assessment for ammonia production process (in Polish). In 1st Scientific and Technical Conference Innovations in the Chemical Industry; Zalewska, A., Ed.; Polish Chamber of Chemical Industry: Warsaw, Poland, 2018; pp. 80–90. [Google Scholar]
  112. Domański, P.D.; Ławryńczuk, M.; Golonka, S.; Moszowski, B.; Matyja, P. Multi-criteria Loop Quality Assessment: A Large-Scale Industrial Case Study. In Proceedings of the IEEE International Conference on Methods and Models in Automation and Robotics MMAR, Międzyzdroje, Poland, 26–29 August 2019; pp. 99–104. [Google Scholar]
113. Knierim-Dietz, N.; Hanel, L.; Lehner, J. Definition and Verification of the Control Loop Performance for Different Power Plant Types; Technical Report; Institute of Combustion and Power Plant Technology, University of Stuttgart: Stuttgart, Germany, 2012. [Google Scholar]
  114. Hugo, A. Performance assessment of DMC controllers. In Proceedings of the 1999 American Control Conference, San Diego, CA, USA, 2–4 June 1999; Volume 4, pp. 2640–2641. [Google Scholar]
  115. Julien, R.H.; Foley, M.W.; Cluett, W.R. Performance assessment using a model predictive control benchmark. J. Process Control 2004, 14, 441–456. [Google Scholar] [CrossRef]
  116. Sotomayor, O.A.Z.; Odloak, D. Performance Assessment of Model Predictive Control Systems. IFAC Proc. Vol. 2006, 39, 875–880. [Google Scholar] [CrossRef]
  117. Zhao, C.; Zhao, Y.; Su, H.; Huang, B. Economic performance assessment of advanced process control with LQG benchmarking. J. Process Control 2009, 19, 557–569. [Google Scholar] [CrossRef]
  118. Zhao, C.; Su, H.; Gu, Y.; Chu, J. A Pragmatic Approach for Assessing the Economic Performance of Model Predictive Control Systems and Its Industrial Application. Chin. J. Chem. Eng. 2009, 17, 241–250. [Google Scholar] [CrossRef]
119. Ko, B.S.; Edgar, T.F. Performance assessment of multivariable feedback control systems. In Proceedings of the 2000 American Control Conference, Chicago, IL, USA, 28–30 June 2000; Volume 6, pp. 4373–4377. [Google Scholar]
  120. Ko, B.S.; Edgar, T.F. Performance assessment of constrained model predictive control systems. AIChE J. 2001, 47, 1363–1371. [Google Scholar] [CrossRef]
  121. Shah, S.L.; Patwardhan, R.; Huang, B. Multivariate Controller Performance Analysis: Methods, Applications and Challenges. In Proceedings of the Sixth International Conference on Chemical Process Control, Tucson, AZ, USA, 7–12 January 2001; pp. 199–207. [Google Scholar]
  122. Xu, F.; Huang, B.; Tamayo, E.C. Assessment of Economic Performance of Model Predictive Control through Variance/Constraint Tuning. IFAC Proc. Vol. 2006, 39, 899–904. [Google Scholar] [CrossRef][Green Version]
  123. Yuan, Q.; Lennox, B. The Investigation of Multivariable Control Performance Assessment Techniques. In Proceedings of the UKACC International Conference on Control 2008, Cardiff, UK, 3–5 September 2008. [Google Scholar]
  124. Harrison, C.A.; Qin, S.J. Minimum variance performance map for constrained model predictive control. J. Process Control 2009, 19, 1199–1204. [Google Scholar] [CrossRef]
  125. Pour, N.D.; Huang, B.; Shah, S. Performance assessment of advanced supervisory-regulatory control systems with subspace LQG benchmark. Automatica 2010, 46, 1363–1368. [Google Scholar] [CrossRef]
  126. Sun, Z.; Qin, S.J.; Singhal, A.; Megan, L. Control performance monitoring via model residual assessment. In Proceedings of the 2012 American Control Conference (ACC), Montréal, QC, Canada, 27–29 June 2012; pp. 2800–2805. [Google Scholar]
  127. Borrero-Salazar, A.A.; Cardenas-Cabrera, J.; Barros-Gutierrez, D.A.; Jiménez-Cabas, J. A comparison study of MPC strategies based on minimum variance control index performance. Rev. ESPACIOS 2019, 40, 12–38. [Google Scholar]
128. Xu, F.; Huang, B.; Akande, S. Performance Assessment of Model Predictive Control for Variability and Constraint Tuning. Ind. Eng. Chem. Res. 2007, 46, 1208–1219. [Google Scholar] [CrossRef]
  129. Wei, W.; Zhuo, H. Research of performance assessment and monitoring for multivariate model predictive control system. In Proceedings of the 2009 4th International Conference on Computer Science Education, Nanning, China, 25–28 July 2009; pp. 509–514. [Google Scholar]
  130. Huang, B.; Shah, S.L.; Kwok, E.K. On-line control performance monitoring of MIMO processes. In Proceedings of the 1995 American Control Conference, Seattle, WA, USA, 21–23 June 1995; Volume 2, pp. 1250–1254. [Google Scholar]
  131. Zhang, R.; Zhang, Q. Model predictive control performance assessment using a prediction error benchmark. In Proceedings of the 2011 International Symposium on Advanced Control of Industrial Processes (ADCONIP), Hangzhou, China, 23–26 May 2011; pp. 571–574. [Google Scholar]
  132. Patwardhan, R.S.; Shah, S.L. Issues in performance diagnostics of model-based controllers. J. Process Control 2002, 12, 413–427. [Google Scholar] [CrossRef]
  133. Schäfer, J.; Cinar, A. Multivariable MPC system performance assessment, monitoring, and diagnosis. J. Process Control 2004, 14, 113–129. [Google Scholar] [CrossRef]
  134. Loquasto, F.; Seborg, D.E. Monitoring Model Predictive Control Systems Using Pattern Classification and Neural Networks. Ind. Eng. Chem. Res. 2003, 42, 4689–4701. [Google Scholar] [CrossRef]
  135. Agarwal, N.; Huang, B.; Tamayo, E.C. Assessing Model Prediction Control (MPC) Performance. 1. Probabilistic Approach for Constraint Analysis. Ind. Eng. Chem. Res. 2007, 46, 8101–8111. [Google Scholar] [CrossRef]
  136. Agarwal, N.; Huang, B.; Tamayo, E.C. Assessing Model Prediction Control (MPC) Performance. 2. Bayesian Approach for Constraint Tuning. Ind. Eng. Chem. Res. 2007, 46, 8112–8119. [Google Scholar] [CrossRef]
  137. Kesavan, P.; Lee, J.H. Diagnostic Tools for Multivariable Model-Based Control Systems. Ind. Eng. Chem. Res. 1997, 36, 2725–2738. [Google Scholar] [CrossRef]
  138. Kesavan, P.; Lee, J.H. A set based approach to detection and isolation of faults in multivariable systems. Comput. Chem. Eng. 2001, 25, 925–940. [Google Scholar] [CrossRef]
  139. Harrison, C.A.; Qin, S.J. Discriminating between disturbance and process model mismatch in model predictive control. J. Process Control 2009, 19, 1610–1616. [Google Scholar] [CrossRef]
  140. Badwe, A.S.; Gudi, R.D.; Patwardhan, R.S.; Shah, S.L.; Patwardhan, S.C. Detection of model-plant mismatch in MPC applications. J. Process Control 2009, 19, 1305–1313. [Google Scholar] [CrossRef]
  141. Sun, Z.; Qin, S.J.; Singhal, A.; Megan, L. Performance monitoring of model-predictive controllers via model residual assessment. J. Process Control 2013, 23, 473–482. [Google Scholar] [CrossRef]
  142. Chen, J. Statistical Methods for Process Monitoring and Control. Master’s Thesis, McMaster University, Hamilton, ON, Canada, 2014. [Google Scholar]
  143. Pannocchia, G.; De Luca, A.; Bottai, M. Prediction Error Based Performance Monitoring, Degradation Diagnosis and Remedies in Offset-Free MPC: Theory and Applications. Asian J. Control 2014, 16, 995–1005. [Google Scholar] [CrossRef]
  144. Zhao, Y.; Chu, J.; Su, H.; Huang, B. Multi-step prediction error approach for controller performance monitoring. Control Eng. Pract. 2010, 18, 1–12. [Google Scholar] [CrossRef]
  145. Botelho, V.; Trierweiler, J.O.; Farenzena, M.; Duraiski, R. Methodology for Detecting Model–Plant Mismatches Affecting Model Predictive Control Performance. Ind. Eng. Chem. Res. 2015, 54, 12072–12085. [Google Scholar] [CrossRef]
  146. Botelho, V.; Trierweiler, J.O.; Farenzena, M.; Duraiski, R. Perspectives and challenges in performance assessment of model predictive control. Can. J. Chem. Eng. 2016, 94, 1225–1241. [Google Scholar] [CrossRef]
  147. Lee, K.H.; Huang, B.; Tamayo, E.C. Sensitivity analysis for selective constraint and variability tuning in performance assessment of industrial MPC. Control Eng. Pract. 2008, 16, 1195–1215. [Google Scholar] [CrossRef]
  148. Godoy, J.L.; Ferramosca, A.; González, A.H. Economic performance assessment and monitoring in LP-DMC type controller applications. J. Process Control 2017, 57, 26–37. [Google Scholar] [CrossRef]
  149. Rodrigues, J.A.D.; Maciel Filho, R. Analysis of the predictive DMC controller performance applied to a feed-batch bioreactor. Braz. J. Chem. Eng. 1997, 14. [Google Scholar] [CrossRef]
  150. Zhang, Q.; Li, S. Performance Monitoring and Diagnosis of Multivariable Model Predictive Control Using Statistical Analysis. Chin. J. Chem. Eng. 2006, 14, 207–215. [Google Scholar] [CrossRef]
  151. AlGhazzawi, A.; Lennox, B. Model predictive control monitoring using multivariate statistics. J. Process Control 2009, 19, 314–327. [Google Scholar] [CrossRef]
152. Chen, Y.T.; Li, S.Y.; Li, N. Performance analysis on dynamic matrix controller with single prediction strategy. In Proceedings of the 11th World Congress on Intelligent Control and Automation, Shenyang, China, 29 June–4 July 2014; pp. 1694–1699. [Google Scholar]
  153. Khan, M.; Tahiyat, M.; Imtiaz, S.; Choudhury, M.A.A.S.; Khan, F. Experimental evaluation of control performance of MPC as a regulatory controller. ISA Trans. 2017, 70, 512–520. [Google Scholar] [CrossRef]
  154. Shang, L.Y.; Tian, X.M.; Cao, Y.P.; Cai, L.F. MPC Performance Monitoring and Diagnosis Based on Dissimilarity Analysis of PLS Cross-product Matrix. Acta Autom. Sin. 2017, 43, 271–279. [Google Scholar]
  155. Shang, L.; Wang, Y.; Deng, X.; Cao, Y.; Wang, P.; Wang, Y. An Enhanced Method to Assess MPC Performance Based on Multi-Step Slow Feature Analysis. Energies 2019, 12, 3799. [Google Scholar] [CrossRef][Green Version]
  156. Shang, L.; Wang, Y.; Deng, X.; Cao, Y.; Wang, P.; Wang, Y. A Model Predictive Control Performance Monitoring and Grading Strategy Based on Improved Slow Feature Analysis. IEEE Access 2019, 7, 50897–50911. [Google Scholar] [CrossRef]
  157. Domański, P.D.; Ławryńczuk, M. Assessment of the GPC Control Quality using Non-Gaussian Statistical Measures. Int. J. Appl. Math. Comput. Sci. 2017, 27, 291–307. [Google Scholar] [CrossRef]
  158. Domański, P.D.; Ławryńczuk, M. Assessment of Predictive Control Performance using Fractal Measures. Nonlinear Dyn. 2017, 89, 773–790. [Google Scholar] [CrossRef][Green Version]
  159. Domański, P.D.; Ławryńczuk, M. Multi-Criteria Control Performance Assessment Method for Multivariate MPC. In Proceedings of the 2020 American Control Conference, Denver, CO, USA, 1–3 July 2020. accepted for publication. [Google Scholar]
  160. Domański, P.D.; Ławryńczuk, M. Control Quality Assessment of Nonlinear Model Predictive Control Using Fractal and Entropy Measures. In Nonlinear Dynamics and Control; Lacarbonara, W., Balachandran, B., Ma, J., Tenreiro Machado, J.A., Stepan, G., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 147–156. [Google Scholar]
  161. Xu, Y.; Li, N.; Li, S. A Data-driven performance assessment approach for MPC using improved distance similarity factor. In Proceedings of the 2015 IEEE 10th Conference on Industrial Electronics and Applications (ICIEA), Auckland, New Zealand, 15–17 June 2015; pp. 1870–1875. [Google Scholar]
  162. Xu, Y.; Zhang, G.; Li, N.; Zhang, J.; Li, S.; Wang, L. Data-Driven Performance Monitoring for Model Predictive Control Using a mahalanobis distance based overall index. Asian J. Control 2019, 21, 891–907. [Google Scholar] [CrossRef]
  163. Gao, J.; Patwardhan, R.; Akamatsu, K.; Hashimoto, Y.; Emoto, G.; Shah, S.L.; Huang, B. Performance evaluation of two industrial MPC controllers. Control Eng. Pract. 2003, 11, 1371–1387. [Google Scholar] [CrossRef]
  164. Jiang, H.; Shah, S.L.; Huang, B.; Wilson, B.; Patwardhan, R.; Szeto, F. Performance Assessment and Model Validation of Two Industrial MPC Controllers. IFAC Proc. Vol. 2008, 41, 8387–8394. [Google Scholar] [CrossRef][Green Version]
  165. Claro, É.R.; Botelho, V.; Trierweiler, J.O.; Farenzena, M. Model Performance Assessment of a Predictive Controller for Propylene/Propane Separation. IFAC-PapersOnLine 2016, 49, 978–983. [Google Scholar] [CrossRef]
  166. Botelho, V.R.; Trierweiler, J.O.; Farenzena, M.; Longhi, L.G.S.; Zanin, A.C.; Teixeira, H.C.G.; Duraiski, R.G. Model assessment of MPCs with control ranges: An industrial application in a delayed coking unit. Control Eng. Pract. 2019, 84, 261–273. [Google Scholar] [CrossRef]
  167. Domański, P.D. Statistical measures for proportional–integral–derivative control quality: Simulations and industrial data. Proc. Inst. Mech. Eng. Part I J. Syst. Control Eng. 2018, 232, 428–441. [Google Scholar] [CrossRef]
Figure 1. The hierarchical layout of a control system. APC, Advanced Process Control.
Figure 2. Receding horizon predictive control principle.
Figure 3. CPA industrial utilization process.
Figure 4. CPA techniques classification.
Figure 5. MPC internal approach to the CPA.
Figure 6. MPC external approach to the CPA.
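The receding-horizon principle depicted in Figure 2 can be sketched in a few lines: at every sample the controller re-optimizes a predicted trajectory over a finite horizon, but applies only the first move of the optimal sequence. The scalar plant model, weights, input grid, and constant-input candidate sequences below are illustrative assumptions for this sketch only; they are not taken from the survey or from any particular MPC product.

```python
# Minimal receding-horizon control sketch for a scalar linear plant
# x[k+1] = a*x[k] + b*u[k]; all numeric values are illustrative assumptions.

def predict(x, u_seq, a=0.9, b=0.5):
    """Simulate the internal model over the horizon for a candidate input sequence."""
    traj = []
    for u in u_seq:
        x = a * x + b * u
        traj.append(x)
    return traj

def cost(x0, u_seq, setpoint=1.0, lam=0.1):
    """Quadratic tracking cost plus an input penalty (weights assumed)."""
    traj = predict(x0, u_seq)
    return sum((x - setpoint) ** 2 for x in traj) + lam * sum(u * u for u in u_seq)

def mpc_step(x0, horizon=5):
    """Enumerate candidate constant-input sequences on a coarse grid,
    pick the cheapest, and return only its first move -- the receding
    horizon: the rest of the sequence is discarded and recomputed next sample."""
    candidates = [u / 10.0 for u in range(-20, 21)]
    return min(candidates, key=lambda u: cost(x0, [u] * horizon))

# Closed-loop simulation: re-optimize at every sample instant.
x = 0.0
for _ in range(30):
    u = mpc_step(x)
    x = 0.9 * x + 0.5 * u  # plant assumed identical to the model here
print(round(x, 2))
```

Exhaustive search over constant inputs keeps the example self-contained; a practical MPC would instead solve a quadratic program over the full input sequence, subject to constraints.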
Domański, P.D. Performance Assessment of Predictive Control—A Survey. Algorithms 2020, 13, 97. https://doi.org/10.3390/a13040097