Improved least squares identification algorithm for multivariable Hammerstein systems☆
Introduction
The single variable Hammerstein system, which consists of a nonlinear block followed by a linear block, is popular for modelling various practical nonlinear systems [1]. The identification methods for single variable Hammerstein systems include the over-parameterization method [2], [3], the iterative identification method [4], [5], [6], the hierarchical identification method [7], [8], [9], the key-term separation principle based method [10], [11], [12], and the maximum likelihood estimation method [13], [14], [15].
Multivariable linear and nonlinear systems widely exist in complex systems. Many identification methods for multivariable linear systems have been proposed, including the subspace identification methods [16], [17], [18], the hierarchical identification methods [19], [20], [21], and the maximum likelihood estimation methods [22], [23], [24]. For decades, multivariable Hammerstein nonlinear systems have aroused much research interest in the modelling and control fields. Van der Veen et al. applied a separable least squares algorithm to identify the parameters of multivariable wind turbine Hammerstein systems with two input nonlinearities by using the over-parameterization based method [25]. Chan et al. transformed the nonlinear identification problem into a linear one by using cardinal cubic spline functions to model the static nonlinearities [26]. Ikhouane and Giri studied a unified approach for the parametric identification of multivariable Hammerstein systems [27], transforming the Hammerstein identification problem into a linear identification problem. Jafari et al. investigated a hierarchical least squares iterative algorithm to simultaneously estimate the unknown parameter vector and parameter matrix of Hammerstein systems [28].
The objective of this paper is, by using the Taylor expansion in a least squares quadratic criterion function, to investigate an improved least squares (ILS) algorithm to identify the parameters of a multivariable Hammerstein OEMA nonlinear system, whose identification model is not a regression model. The parameter vector of the proposed method is defined as a unified vector of all parameter vectors in a non-regression model of the system; the information vector is defined as the derivative of the noise variable with respect to the unified parameter vector.
The characteristic of the ILS estimation method is that its identification model contains the minimum number of unknown parameters, which gives it a high computational efficiency. The contributions of this paper lie in the following aspects.
- The ILS method can be applied directly to the non-regression identification model of the multivariable Hammerstein OEMA system, by defining the parameter vector as a unified vector of all parameter vectors and the information vector as the derivative of the noise variable with respect to the unified parameter vector.
- The ILS method involves the minimum number of unknown parameters and is more efficient than the traditional over-parameterization least squares method. In over-parameterization identification models of Hammerstein systems, the parameter vector contains cross-products between the parameters of the nonlinear block and those of the linear block, resulting in many redundant parameters and a very large computational load.
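The redundancy argument above can be made concrete with a small sketch. The sizes used here are hypothetical, assuming a linear block with n parameters and a static nonlinearity written as a combination of m basis functions:

```python
def param_counts(n, m):
    """Return (direct, over_parameterized) parameter counts.

    The direct (bilinear) model estimates the n linear-block and the m
    nonlinear-block parameters themselves; the over-parameterized model
    estimates every cross-product between them.
    """
    return n + m, n * m

direct, overp = param_counts(10, 5)
print(direct, overp)  # 15 vs 50: the redundancy grows multiplicatively
```

Because the over-parameterized count grows as n*m rather than n+m, the gap widens quickly as either block becomes richer.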
Recently, Li et al. explored an efficient estimation approach, the kernel machine and space projection (KMSP) method, to identify Hammerstein, Wiener, and Hammerstein–Wiener models [29]. The differences between the proposed ILS method and the KMSP identification method lie mainly in the following aspects. First, the KMSP approach uses the kernel machine to represent the functions and the space projection to separate the represented functions, whereas the ILS method expresses the nonlinear block by a bilinear function and solves for the parameters of the bilinear system. Second, the KMSP method uses kernel machines to transform the model into a solvable problem and then solves the transformed problem by a space projection approach, while the improved least squares approach defines a unified vector of all parameter vectors as the parameter vector and the derivative of the noise variable with respect to the unified parameter vector as the information vector.
The rest of this paper is organized as follows. Section 2 constructs the identification model for a multivariable Hammerstein OEMA system; Section 3 investigates the improved least squares (ILS) algorithm for the multivariable Hammerstein OEMA system; Section 4 derives the over-parameterization based least squares (OP-LS) algorithm for a comparison; Section 5 analyzes the convergence of the ILS algorithm. Numerical simulations are carried out in Section 6 to demonstrate the effectiveness of the proposed algorithm. Finally, some concluding remarks are offered in Section 7.
The identification model of the multivariable Hammerstein OEMA system
Let us introduce some notation. "$A =: X$" and "$X := A$" stand for "$A$ is defined as $X$"; the symbol $I$ stands for an identity matrix of appropriate size; the superscript T denotes the matrix/vector transpose; $\hat{z}(t)$ stands for the estimate of $z$ at time $t$; the norm of the matrix $X$ is defined by $\|X\|^2 := \mathrm{tr}[XX^{\mathrm{T}}]$.
Consider a multivariable Hammerstein output error moving average (OEMA) system in Fig. 1, where
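A Hammerstein OEMA structure of this kind can be simulated directly: a static nonlinearity feeds a linear dynamic block, and a moving-average-filtered white noise is added at the output. The coefficients and the polynomial basis below are illustrative assumptions, not the paper's system:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed example coefficients (not the paper's): A(z) = 1 + a1*z^-1,
# B_j(z) = b_j1*z^-1 + b_j2*z^-2 for each input channel j, D(z) = 1 + d1*z^-1.
a1, d1 = -0.5, 0.3
b = np.array([[0.8, 0.4],   # channel 1: b_11, b_12
              [0.6, 0.2]])  # channel 2: b_21, b_22

def f(u):
    """Static nonlinearity: a combination of basis functions (assumed)."""
    return 1.0 * u + 0.5 * u**2

N = 200
u = rng.standard_normal((N, 2))    # two uncorrelated input sequences
v = 0.1 * rng.standard_normal(N)   # white noise
ubar = f(u)                        # output of the nonlinear block

x = np.zeros(N)                    # noise-free (inner) output
y = np.zeros(N)                    # measured output
for t in range(2, N):
    # linear block: x(t) = -a1*x(t-1) + sum_j [b_j1*ubar_j(t-1) + b_j2*ubar_j(t-2)]
    x[t] = -a1 * x[t - 1] + sum(b[j, 0] * ubar[t - 1, j] + b[j, 1] * ubar[t - 2, j]
                                for j in range(2))
    # OEMA noise model: w(t) = v(t) + d1*v(t-1)
    y[t] = x[t] + v[t] + d1 * v[t - 1]
```

The inner variable x(t) is unmeasured, which is exactly what makes the identification model non-regressive and motivates the auxiliary-model and Taylor-expansion ideas used later.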
The improved least squares algorithm (the ILS algorithm)
In this section, we adopt the Taylor expansion of a noise term and of a least squares quadratic criterion function to investigate the improved least squares estimation method for the system. Define a unified parameter vector $\theta$ containing all unknown parameter vectors of the model (9). Let $\hat{\theta}(t)$ represent the parameter estimate of $\theta$ at instant $t$; the first-order Taylor expansion of $v(t)$ at $\hat{\theta}(t-1)$ can be expressed as $v(t) \approx v(t)\big|_{\hat{\theta}(t-1)} + \big[\partial v(t)/\partial \theta\big]^{\mathrm{T}}\big|_{\hat{\theta}(t-1)}\,[\theta - \hat{\theta}(t-1)]$, where the
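The essence of this construction, linearizing the residual around the current estimate and taking the derivative as the information vector, can be sketched on a deliberately simplified scalar Hammerstein FIR model (an assumed toy example, not the paper's OEMA system), iterated offline in Gauss-Newton fashion:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy model: y(t) = b1*f(u(t-1)) + b2*f(u(t-2)) + v(t),
# f(u) = u + c2*u**2, with c1 = 1 fixed to remove the scale
# ambiguity between the two blocks.
b1, b2, c2 = 0.8, 0.3, 0.5
N = 500
u = rng.standard_normal(N)
v = 0.01 * rng.standard_normal(N)
f = lambda uu, c: uu + c * uu**2
y = np.zeros(N)
y[2:] = b1 * f(u[1:-1], c2) + b2 * f(u[:-2], c2) + v[2:]

theta = np.array([0.5, 0.5, 0.0])  # initial guess for [b1, b2, c2]
for _ in range(20):
    tb1, tb2, tc2 = theta
    ub = f(u, tc2)
    r = y[2:] - tb1 * ub[1:-1] - tb2 * ub[:-2]   # residual (noise estimate)
    # Derivative of the residual w.r.t. [b1, b2, c2]: this plays the
    # role of the information vector in the linearized criterion.
    J = np.column_stack([-ub[1:-1],
                         -ub[:-2],
                         -(tb1 * u[1:-1]**2 + tb2 * u[:-2]**2)])
    theta = theta - np.linalg.solve(J.T @ J, J.T @ r)  # least squares step

print(np.round(theta, 2))  # close to the true [0.8, 0.3, 0.5]
```

Note that the unified vector theta mixes parameters from both blocks, so the model is bilinear rather than a linear regression; the linearization is what makes a least squares step applicable.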
The over-parameterization based least squares algorithm (the OP-LS algorithm)
For comparison, we briefly derive the over-parameterization based least squares algorithm.
Rewrite Eqs. (5) and (7); thus, we have
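Because the over-parameterized regressor contains the cross-products between the two blocks' parameters, ordinary least squares applies directly. On the same assumed toy model as above (not the paper's system), the OP-LS idea looks like this:

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed toy model: y(t) = b1*f(u(t-1)) + b2*f(u(t-2)) + noise,
# f(u) = c1*u + c2*u**2 with c1 = 1.
b1, b2, c2 = 0.8, 0.3, 0.5
N = 500
u = rng.standard_normal(N)
fu = u + c2 * u**2
y = np.zeros(N)
y[2:] = b1 * fu[1:-1] + b2 * fu[:-2] + 0.01 * rng.standard_normal(N - 2)

# Over-parameterized regressor: the 4 redundant products
# [b1*c1, b1*c2, b2*c1, b2*c2] against [u(t-1), u(t-1)^2, u(t-2), u(t-2)^2].
Phi = np.column_stack([u[1:-1], u[1:-1]**2, u[:-2], u[:-2]**2])
prod, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)

# Recover the physical parameters from the redundant product estimates
b1_hat, c2_hat = prod[0], prod[1] / prod[0]
print(np.round([b1_hat, prod[2], c2_hat], 2))  # close to [0.8, 0.3, 0.5]
```

Even in this tiny example the OP-LS regressor carries 4 products instead of 3 direct parameters; for larger blocks the redundancy, and hence the computational load, grows much faster.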
The convergence analysis of the ILS algorithm
The convergence analysis of the ILS algorithm for the Hammerstein OEMA system proceeds as follows. Assume that $\{v(t)\}$ is a martingale difference sequence on a probability space, with respect to the $\sigma$-algebra sequence generated by $v(t)$ [31]. The sequence satisfies Theorem 1. For the ILS algorithm in Eqs. (28)–(40)
Example
Consider the following multivariable Hammerstein OEMA system with two inputs. The input is taken as an uncorrelated
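A common way to realize such an uncorrelated two-channel excitation (assumed here, since the snippet truncates before the exact specification) is two independent zero-mean unit-variance white sequences, whose sample cross-correlation can be checked directly:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 10000
u = rng.standard_normal((N, 2))  # two independent white input channels

# Sample cross-correlation between the two channels; for independent
# sequences it shrinks roughly like 1/sqrt(N).
cross = np.corrcoef(u[:, 0], u[:, 1])[0, 1]
print(abs(cross) < 0.05)
```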
Conclusions
In this paper, we investigate the identification problem of a multivariable Hammerstein OEMA system. By using two Taylor expansions, on the noise variable and on the least squares quadratic criterion function, we present an improved least squares algorithm to identify the parameters of the bilinear identification model of the multivariable Hammerstein OEMA system. The proposed ILS algorithm can reduce the computational load compared with the OP-LS algorithm and can combine the auxiliary model
References (36)
- Identification methods for Hammerstein nonlinear systems, Digit. Signal Process. (2011)
- Parameter estimation for an input nonlinear state space system with time delay, J. Frankl. Inst. (2014)
- Convergence of normalized iterative identification of Hammerstein systems, Syst. Control Lett. (2011)
- Iterative identification of block-oriented nonlinear systems based on biconvex optimization, Syst. Control Lett. (2015)
- Hierarchical multi-innovation stochastic gradient algorithm for Hammerstein nonlinear system modeling, Appl. Math. Model. (2013)
- Identification of nonlinear dynamic systems with input saturation and output backlash using three-block cascade models, J. Frankl. Inst. (2014)
- Parameter estimation for Hammerstein CARARMA systems based on the Newton iteration, Appl. Math. Lett. (2013)
- Kernel methods for subspace identification of multivariable LPV and bilinear systems, Automatica (2005)
- Subspace identification of MIMO LPV systems using a periodic scheduling sequence, Automatica (2007)
- Subspace identification of bilinear and LPV systems for open- and closed-loop data, Automatica (2009)
- Hierarchical least-squares based iterative identification for multivariable systems with moving average noises, Math. Comput. Model.
- Maximum-likelihood estimation for multi-aspect multi-baseline SAR interferometry of urban areas, ISPRS J. Photogramm. Remote Sens.
- Identification of MIMO Hammerstein systems using cardinal spline functions, J. Process Control
- A unified approach for the parametric identification of SISO/MIMO Wiener and Hammerstein systems, J. Frankl. Inst.
- Identification of multivariable nonlinear systems in the presence of colored noises using iterative hierarchical least squares algorithm, ISA Trans.
- Recursive least squares parameter identification for systems with colored noise using the filtering technique and the auxiliary model, Digit. Signal Process.
- Hierarchical parameter estimation algorithms for multivariable systems using measurement information, Inf. Sci.
- Recursive least squares estimation algorithm applied to a class of linear-in-parameters output error moving average systems, Appl. Math. Lett.
☆ This work was supported by the National Natural Science Foundation of China (Nos. 61573205, 61472195), and the Shandong Provincial Natural Science Foundation of China (No. ZR2015FM017).