
JACIII Vol.12 No.1 pp. 85-93
doi: 10.20965/jaciii.2008.p0085
(2008)

Paper:

A Global Optimization Method RasID-GA for Neural Network Training

Dongkyu Sohn, Shingo Mabu, Kaoru Shimada,
Kotaro Hirasawa, and Jinglu Hu

Graduate School of Information, Production and Systems, Waseda University, 2-7 Hibikino, Wakamatsu-ku, Kitakyushu-shi, Fukuoka, 808-0135, JAPAN

Received: May 1, 2007
Accepted: October 2, 2007
Published: January 20, 2008
Keywords: optimization, RasID, GA, switching
Abstract
This paper applies Adaptive Random search with Intensification and Diversification combined with a Genetic Algorithm (RasID-GA) to neural network training. In previous work, we proposed RasID-GA, which combines the best properties of RasID and the Genetic Algorithm for optimization. Neural networks are widely used in pattern recognition, system modeling, prediction, and other areas. Most neural network training relies on gradient-based schemes such as the well-known back-propagation (BP), but BP is easily trapped in local minima. In this paper, we train newly developed multi-branch neural networks using RasID-GA with a constraint coefficient C, by which the feasible solution space is controlled. In addition, we use Mackey-Glass time series prediction to test the generalization ability of the proposed method.
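
As a concrete illustration of the hybrid described in the abstract, the following Python sketch pairs a real-coded GA (global exploration) with an adaptive random search that shrinks its step width after a successful move (intensification) and widens it after failures (diversification), refining the best individual of each generation. This is a minimal sketch under stated assumptions, not the authors' implementation: the toy objective, population sizes, mutation rates, and the 0.9/1.1 width factors are illustrative, and the constraint coefficient C is read here as clipping weights to the interval [-C, C].

import numpy as np

rng = np.random.default_rng(0)

def loss(w):
    # Toy quadratic objective standing in for a network's training error.
    return float(np.sum(w ** 2))

def rasid_refine(w, C, steps=20, width=0.5):
    # Simplified intensification/diversification local search.
    # The 0.9/1.1 width factors are illustrative, not the paper's values.
    f = loss(w)
    for _ in range(steps):
        d = rng.normal(scale=width, size=w.shape)
        for cand in (w + d, w - d):              # try the step and its reverse
            cand = np.clip(cand, -C, C)          # feasible region via C (our reading)
            fc = loss(cand)
            if fc < f:
                w, f = cand, fc
                width *= 0.9                     # intensification: shrink the search
                break
        else:
            width *= 1.1                         # diversification: widen the search
    return w, f

def rasid_ga(dim=10, pop=30, gens=50, C=5.0):
    P = rng.uniform(-C, C, size=(pop, dim))
    for _ in range(gens):
        P = P[np.argsort([loss(p) for p in P])]  # sort by fitness, best first
        # Elitist GA step: keep the best half, refill with uniform crossover
        # plus sparse Gaussian mutation.
        for i in range(pop // 2, pop):
            a = P[rng.integers(pop // 2)]
            b = P[rng.integers(pop // 2)]
            child = np.where(rng.random(dim) < 0.5, a, b)
            child += rng.normal(scale=0.1, size=dim) * (rng.random(dim) < 0.1)
            P[i] = np.clip(child, -C, C)
        # Memetic refinement of the current best individual.
        P[0], _ = rasid_refine(P[0], C)
    return P[0], loss(P[0])

best_w, best_f = rasid_ga()
print(f"best loss after RasID-GA sketch: {best_f:.6f}")

In the paper's setting, loss would instead be the training error of a multi-branch neural network, with the network weights playing the role of w.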
Cite this article as:
D. Sohn, S. Mabu, K. Shimada, K. Hirasawa, and J. Hu, “A Global Optimization Method RasID-GA for Neural Network Training,” J. Adv. Comput. Intell. Intell. Inform., Vol.12 No.1, pp. 85-93, 2008.
References
  [1] N. Baba, “A new approach for finding the global minimum of error function of neural networks,” Neural Networks, Vol.2, pp. 367-373, 1989.
  [2] N. Baba, “A hybrid algorithm for finding a global minimum,” Int. Journal of Control, Vol.37, pp. 929-942, 1983.
  [3] D. Sohn, H. Hatakeyama, S. Mabu, K. Hirasawa, and J. Hu, “Adaptive Random Search with Intensification and Diversification Combined with Genetic Algorithm,” Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol.10, No.6, pp. 954-963, 2006.
  [4] D. Sohn, K. Hirasawa, and J. Hu, “Adaptive Random Search with Intensification and Diversification combined with Genetic Algorithm,” Congress on Evolutionary Computation 2005 (CEC2005), pp. 1462-1469, 2005.
  [5] K. Hirasawa, H. Miyazaki, and J. Hu, “Enhancement of RasID and Its Evaluation,” T.SICE, Vol.38, No.9, pp. 775-783, 2002.
  [6] K. Hirasawa, K. Togo, J. Hu, M. Ohbayashi, and J. Murata, “A New Adaptive Random Search Method in Neural Networks –RasID–,” T.SICE, Vol.34, No.8, pp. 1088-1096, 1998.
  [7] J. Hu, K. Hirasawa, and J. Murata, “RasID – Random Search for Neural Network Training,” Journal of Advanced Computational Intelligence, Vol.2, No.4, pp. 134-141, 1998.
  [8] J. Hu and K. Hirasawa, “Adaptive random search approach to identification of neural network model,” Proc. of the 31st ISCIE Int. Symposium on Stochastic Systems Theory and Its Applications, Yokohama, pp. 73-78, Nov. 11-12, 1999.
  [9] J. Holland, “Adaptation in Natural and Artificial Systems,” Ann Arbor, MI: University of Michigan Press, 1975.
  [10] P. A. Moscato, “On evolution, search, optimization, genetic algorithms and martial arts: Toward memetic algorithms,” Caltech Concurrent Computation Program, California Institute of Technology, Pasadena, Tech. Rep. 790, 1989.
  [11] D. Molina, F. Herrera, and M. Lozano, “Adaptive Local Search Parameters for Real-Coded Memetic Algorithms,” Congress on Evolutionary Computation 2005 (CEC2005), pp. 888-895, 2005.
  [12] J. Matyas, “Random optimization,” Automation and Remote Control, Vol.26, pp. 244-251, 1965.
  [13] F. J. Solis and R. J.-B. Wets, “Minimization by random search techniques,” Mathematics of Operations Research, Vol.6, pp. 19-30, 1981.
  [14] A. Törn and A. Žilinskas, “Global Optimization,” Lecture Notes in Computer Science, Vol.350, Springer-Verlag, Berlin, Germany, 1989.
  [15] T. Yamashita, K. Hirasawa, J. Hu, and J. Murata, “Multi-Branch Structure of Layered Neural Networks,” Int. Conf. on Neural Information Processing, pp. 243-247, Nov., 2002.
  [16] T. Yamashita, K. Hirasawa, and J. Hu, “Multi-Branch Structure and its Localized Property in Layered Neural Networks,” Proc. of IJCNN, pp. 1039-1044, 2004.
  [17] K. Hirasawa, X. Wang, J. Murata, J. Hu, and C. Jin, “Universal learning network and its application to chaos control,” Neural Networks, Vol.13, pp. 239-253, 2000.
  [18] K. Hirasawa, M. Ohbayashi, H. Fujita, and M. Koga, “Universal Learning Network Theory,” Trans. of Institute of Electrical Engineers of Japan, Vol.116-C, No.7, pp. 794-801, 1996.
  [19] M. Farzad, H. Tahersima, and H. Khaloozadeh, “Predicting the Mackey Glass Chaotic Time Series Using Genetic Algorithm,” SICE-ICASE Int. Joint Conf., pp. 5460-5463, Oct., 2006.
  [20] Z. Shi and M. Han, “Support Vector Echo-State Machine for Chaotic Time-Series Prediction,” IEEE Trans. Neural Networks, Vol.18, No.2, pp. 359-372, Mar., 2007.
  [21] T. Kondo and J. Ueno, “Revised GMDH-Type Neural Network Algorithm with a Feedback Loop Identifying Sigmoid Function Neural Network,” Int. Journal of Innovative Computing, Information and Control, Vol.2, No.5, pp. 985-996, 2006.
