Generalization Performance of Subspace Bayes Approach in Linear Neural Networks
Shinichi NAKAJIMA, Sumio WATANABE
Publication: IEICE TRANSACTIONS on Information and Systems
Vol. E89-D, No. 3, pp. 1128-1138
Publication Date: 2006/03/01
Online ISSN: 1745-1361
Print ISSN: 0916-8532
DOI: 10.1093/ietisy/e89-d.3.1128
Type of Manuscript: PAPER
Category: Algorithm Theory
Keywords: empirical Bayes, variational Bayes, neural networks, reduced-rank regression, James-Stein, unidentifiable
Summary:
In unidentifiable models, Bayes estimation has an advantage in generalization performance over maximum likelihood estimation. However, accurate approximation of the posterior distribution requires a huge computational cost. In this paper, we consider an alternative approximation method, which we call a subspace Bayes approach. A subspace Bayes approach is an empirical Bayes approach in which some of the parameters are regarded as hyperparameters. Consequently, in some three-layer models, this approach requires a much lower computational cost than Markov chain Monte Carlo methods. We show that, in three-layer linear neural networks, a subspace Bayes approach is asymptotically equivalent to a positive-part James-Stein type shrinkage estimation, and we theoretically clarify its generalization error and training error. We also discuss its domination over maximum likelihood estimation and its relation to the variational Bayes approach.
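For reference, the classical positive-part James-Stein estimator of the mean of a d-dimensional Gaussian observation x with known noise variance sigma^2 takes the form below; this is only a minimal illustration of the shrinkage idea, and the shrinkage factor derived in the paper for linear neural networks differs in its details:

\hat{\theta}^{\mathrm{JS+}}(x) = \left( 1 - \frac{(d-2)\sigma^{2}}{\|x\|^{2}} \right)_{+} x, \qquad (a)_{+} = \max(a, 0).

The shrinkage factor pulls the estimate toward the origin, which is the mechanism by which such estimators dominate the maximum likelihood estimator when d >= 3.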