Efficient Monte Carlo computation of Fisher information matrix using prior information

ABSTRACT
The Fisher information matrix (FIM) is a critical quantity in several aspects of mathematical modeling, including input selection, model selection, and confidence region calculation. For example, the determinant of the FIM is the main performance metric for choosing input values in a scientific experiment with the aim of achieving the most accurate parameter estimates in a mathematical model. However, analytical determination of the FIM in a general setting, especially for nonlinear models, may be difficult or effectively impossible because of intractable modeling requirements and/or intractable high-dimensional integration.
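As a concrete illustration of the determinant criterion mentioned above, consider a single Gaussian observation with unknown mean and variance, whose FIM is a standard textbook result. The model, parameter values, and function name below are illustrative choices, not taken from the paper:

```python
import numpy as np

# Per-observation FIM for x ~ N(mu, sigma2) with theta = (mu, sigma2):
# F(theta) = [[1/sigma2, 0], [0, 1/(2*sigma2^2)]]  (standard textbook result).
def gaussian_fim(sigma2):
    return np.array([[1.0 / sigma2, 0.0],
                     [0.0, 1.0 / (2.0 * sigma2**2)]])

F = gaussian_fim(2.0)
# A D-optimal design would choose experimental inputs to maximize det(F);
# here there are no inputs to tune, so this just evaluates the criterion.
print(np.linalg.det(F))  # 0.0625
```

In richer models the FIM depends on the chosen input values, and maximizing this determinant over the inputs is exactly the design problem the abstract refers to.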
To circumvent these difficulties, a Monte Carlo (MC) simulation-based technique, the resampling algorithm, is usually recommended; it is based on values of the log-likelihood function, or of its exact stochastic gradient, computed from a set of pseudo data vectors. This paper proposes an extension of the current algorithm that enhances the statistical characteristics of the estimator of the FIM. The modified algorithm is particularly useful when the FIM has a structure in which some elements are analytically known from prior information while the others are unknown. The estimator of the FIM obtained with the proposed algorithm simultaneously preserves the analytically known elements and reduces the variances of the estimators of the unknown elements by capitalizing on the information contained in the known elements.
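A minimal sketch of the resampling idea, assuming a simple Gaussian model in which the score (gradient of the log-likelihood) is available in closed form: the FIM is estimated by averaging outer products of score vectors evaluated on pseudo data drawn from the model itself. The handling of the known element at the end (simply overwriting it with its analytical value) is a deliberate simplification; the paper's algorithm also uses the known elements to reduce the variance of the unknown ones.

```python
import numpy as np

rng = np.random.default_rng(0)

def score(x, mu, sigma2):
    # Gradient of the per-observation log-likelihood of x ~ N(mu, sigma2)
    # with respect to theta = (mu, sigma2); vectorized over x.
    d_mu = (x - mu) / sigma2
    d_s2 = -0.5 / sigma2 + 0.5 * (x - mu) ** 2 / sigma2**2
    return np.stack([d_mu, d_s2])          # shape (2, n)

def mc_fim(mu, sigma2, n_pseudo=200_000):
    # Resampling estimate of the FIM: average of outer products of score
    # vectors computed on pseudo data generated from the model.
    x = rng.normal(mu, np.sqrt(sigma2), size=n_pseudo)
    g = score(x, mu, sigma2)
    return (g @ g.T) / n_pseudo

F_hat = mc_fim(mu=0.0, sigma2=2.0)
# True FIM here is [[0.5, 0], [0, 0.125]]; F_hat matches it up to MC noise.

# Prior information: suppose the (0, 0) element is known analytically to be
# 1/sigma2. Overwriting the known entry is the crude version of "preserving
# the analytically known elements"; the paper's modified algorithm does this
# while also shrinking the variance of the remaining (unknown) entries.
F_known = F_hat.copy()
F_known[0, 0] = 1.0 / 2.0
```

With 200,000 pseudo data vectors the unknown entries are typically within about one percent of their true values; the known entry is exact by construction.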
REFERENCES
- J. C. Spall, Introduction to Stochastic Search and Optimization: Estimation, Simulation, and Control. Wiley-Interscience, 2003.
- J. C. Spall, "Monte Carlo computation of the Fisher information matrix in nonstandard settings," J. Comput. Graph. Statist., vol. 14, no. 4, pp. 889--909, 2005.
- S. Das, R. Ghanem, and J. C. Spall, "Asymptotic sampling distribution for polynomial chaos representation of data: a maximum entropy and Fisher information approach," in Proc. of the 45th IEEE Conference on Decision and Control, San Diego, CA, USA, Dec. 13--15, 2006, CD-ROM.
- P. Bickel and K. Doksum, Mathematical Statistics: Basic Ideas and Selected Topics, Vol. I. Prentice Hall, 2001.
- J. C. Spall, "Multivariate stochastic approximation using a simultaneous perturbation gradient approximation," IEEE Trans. Automat. Control, vol. 37, no. 3, pp. 332--341, 1992.
- J. C. Spall, "Feedback and weighting mechanisms for improving Jacobian (Hessian) estimates in the adaptive simultaneous perturbation algorithm," in Proc. of the 2006 American Control Conference, Minneapolis, Minnesota, USA, June 14--16, 2006, pp. 3086--3091.
- S. Das, "Efficient calculation of Fisher information matrix: Monte Carlo approach using prior information," Master's thesis, Department of Applied Mathematics and Statistics, The Johns Hopkins University, Baltimore, Maryland, USA, May 2007, http://dspace.library.jhu.edu/handle/1774.2/32459.