Geometric Properties of Quasi-Additive Learning Algorithms

Kazushi IKEDA

Publication
IEICE TRANSACTIONS on Fundamentals of Electronics, Communications and Computer Sciences, Vol.E89-A, No.10, pp.2812-2817
Publication Date: 2006/10/01
Online ISSN: 1745-1337
DOI: 10.1093/ietfec/e89-a.10.2812
Print ISSN: 0916-8508
Type of Manuscript: Special Section PAPER (Special Section on Nonlinear Theory and its Applications)
Category: Control, Neural Networks and Learning
Keyword: quasi-additive algorithms, perceptron, information geometry




Summary: 
The family of Quasi-Additive (QA) algorithms is a natural generalization of perceptron learning. A QA algorithm is an on-line learning method with two parameter vectors: one is an accumulation of input vectors, and the other is the weight vector used for prediction, related to the former by a nonlinear function. We show that the two vectors have a dually flat structure from the information-geometric point of view, and that this representation makes it easier to discuss convergence properties.
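The following Python sketch (not from the paper; the names qa_train and link are illustrative) outlines the setup the abstract describes, under the standard QA formulation: an accumulation vector z is updated additively on prediction mistakes, while the weight vector used for prediction is obtained from z through a nonlinear link function applied componentwise. Choosing the identity link recovers the ordinary perceptron; an exponential link gives a Winnow-like multiplicative update.

import numpy as np

# Minimal sketch of a Quasi-Additive (QA) on-line learner, assuming the
# standard formulation: z accumulates label-weighted inputs on mistakes,
# and the prediction weight vector is w = link(z), applied componentwise.
# link(z) = z recovers the perceptron; link(z) = exp(z) is Winnow-like.
def qa_train(samples, labels, link, epochs=10):
    """Run a QA mistake-driven update over (sample, label) pairs, labels in {-1, +1}."""
    z = np.zeros(samples.shape[1])       # accumulation of (label-weighted) input vectors
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            w = link(z)                   # prediction weights via the nonlinear link
            if y * np.dot(w, x) <= 0:     # mistake (or zero margin): additive update of z
                z += y * x
    return link(z)                        # final weight vector used for prediction

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = np.sign(X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]))   # linearly separable toy data
    w = qa_train(X, y, link=lambda z: z)                     # identity link: plain perceptron
    print("training accuracy:", np.mean(np.sign(X @ w) == y))

The two vectors z and w = link(z) in this sketch correspond to the pair of parameter vectors whose dually flat, information-geometric relationship the paper analyzes.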

