Elsevier

Journal of Complexity

Volume 9, Issue 4, December 1993, Pages 427-446

Invited Article
Information-Based Complexity and Nonparametric Worst-Case System Identification

https://doi.org/10.1006/jcom.1993.1028

Abstract

In this paper we review recent results on nonparametric approaches to the identification of linear dynamic systems, under nonprobabilistic assumptions on measurement uncertainties. Two main categories of problems are considered in the paper: H∞ and l1 settings. The H∞ setting assumes that the true system is linear time-invariant and that the available information consists of samples of the frequency response of the system, corrupted by l∞-norm bounded noise. The aim is to estimate a proper, stable, finite-dimensional model. The estimation error is quantified by an H∞ norm, measuring the "distance" of the estimated model from the worst-case system in the class of allowable systems, for the worst-case realization of the measurement error. In the l1 setting, the aim is to identify the samples of the impulse response of an unknown linear time-invariant system. The available information is given by input/output measurements corrupted by l∞-bounded noise, and the estimation error is measured by an l1 norm, for the worst case with respect to allowable systems and noise. In this paper, the main results available in the literature for both settings are reviewed, with particular attention to (a) evaluation of the diameter of information under various experimental conditions, (b) convergence to zero of the diameter of information (i.e., existence of robustly convergent identification procedures), and (c) computation of optimal and almost-optimal algorithms. Some results are also reported for the l∞ setting, which is similar to the l1 setting except that the estimation error is measured by an l∞ norm.
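To illustrate the set-membership viewpoint behind the l1 setting described above, the following sketch (the function name, toy data, and noise bound are illustrative assumptions, not taken from the paper) computes coordinate-wise bounds on the impulse-response samples of an FIR model consistent with l∞-bounded measurement noise, via linear programming over the feasible polytope {h : ‖y − Uh‖∞ ≤ ε}; the interval midpoints give a simple central estimate.

```python
# Illustrative sketch of set-membership impulse-response identification
# under l-infinity bounded noise (not the paper's algorithms).
import numpy as np
from scipy.optimize import linprog

def impulse_response_bounds(u, y, eps, n):
    """Coordinate-wise min/max of FIR coefficients h consistent with
    the measurements: |y_k - (U h)_k| <= eps for all k."""
    m = len(y)
    # Toeplitz convolution matrix: (U h)_k = sum_j h_j * u_{k-j}
    U = np.zeros((m, n))
    for k in range(m):
        for j in range(n):
            if 0 <= k - j < len(u):
                U[k, j] = u[k - j]
    # Feasible set as linear inequalities: U h <= y + eps, -U h <= -(y - eps)
    A_ub = np.vstack([U, -U])
    b_ub = np.concatenate([y + eps, -(y - eps)])
    lo, hi = np.zeros(n), np.zeros(n)
    free = [(None, None)] * n
    for j in range(n):
        c = np.zeros(n)
        c[j] = 1.0
        lo[j] = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=free).fun
        hi[j] = -linprog(-c, A_ub=A_ub, b_ub=b_ub, bounds=free).fun
    return lo, hi

# Toy experiment: an impulse input makes each measurement bound one
# coefficient directly, so the intervals are simply y_k +/- eps.
u = np.array([1.0, 0.0, 0.0])
y = np.array([1.0, 0.5, 0.0])  # noise-free response of h = (1.0, 0.5, 0.0)
lo, hi = impulse_response_bounds(u, y, eps=0.1, n=3)
center = (lo + hi) / 2  # interval-midpoint (central) estimate
```

Richer inputs couple the coefficients, and the interval widths then depend on the experiment design, which is one way to see why the diameter of information varies with experimental conditions as discussed in the abstract.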
