Abstract
Optimization theory and methods profoundly impact numerous engineering designs and applications. The gradient descent method is simpler and more widely used than other search methods for solving optimization problems. However, it is easily trapped in local minima and converges slowly. This work presents a Gradient Forecasting Search Method (GFSM) that enhances the performance of the gradient descent method for solving optimization problems.
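As background for the method discussed here, a minimal sketch of plain gradient descent follows. The function names, the fixed learning rate, and the quadratic example are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=1000):
    """Plain gradient descent: step against the gradient until it vanishes."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - lr * g  # move downhill by a fixed step size
    return x

# Example: minimize f(x, y) = (x - 3)^2 + (y + 1)^2, whose gradient is 2*(x - m)
minimum = gradient_descent(lambda x: 2 * (x - np.array([3.0, -1.0])), [0.0, 0.0])
```

With a fixed step size, this iteration contracts toward the nearest stationary point, which is exactly the slow convergence and local-minimum trapping the abstract describes.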
GFSM is based on the gradient descent method and on the universal Discrete Difference Equation Prediction Model (DDEPM) proposed herein. The concept of the universal DDEPM is derived from the grey prediction model. The original grey prediction model uses a mathematical hypothesis and approximation to transform a continuous differential equation into a discrete difference equation. This approach is questionable because the data to be forecast are invariably discrete. To construct a more precise prediction model, this work adopts a discrete difference equation directly. The proposed GFSM can predict the search direction and trend of the gradient descent method via the universal DDEPM, and can adjust the prediction step dynamically using the golden section search algorithm.
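The core idea of forecasting with a discrete difference model can be illustrated as follows. This is a hypothetical stand-in, not the paper's actual universal DDEPM: it fits a first-order recurrence x[k+1] = a*x[k] + b to the recent iterate history by least squares, then iterates the fitted recurrence forward to extrapolate where the sequence is heading:

```python
import numpy as np

def forecast_next(history, steps):
    """Fit x[k+1] = a*x[k] + b per coordinate over the recorded iterates,
    then iterate the fitted recurrence `steps` times ahead.
    (Illustrative sketch only, not the paper's universal DDEPM.)"""
    h = np.asarray(history, dtype=float)  # shape: (n_iterates, dim)
    pred = h[-1].copy()
    for d in range(h.shape[1]):
        # Least-squares fit of the linear difference equation in coordinate d.
        A = np.column_stack([h[:-1, d], np.ones(len(h) - 1)])
        (a, b), *_ = np.linalg.lstsq(A, h[1:, d], rcond=None)
        v = pred[d]
        for _ in range(steps):
            v = a * v + b  # extrapolate the recurrence forward
        pred[d] = v
    return pred

# Iterates of x[k+1] = 0.5*x[k] + 1 converge to 2; the forecast jumps ahead
# toward that fixed point instead of crawling one step at a time.
hist = [[0.0], [1.0], [1.5], [1.75]]
ahead = forecast_next(hist, steps=10)
```

The payoff is that a forecast several steps ahead can leap over the small, slow steps a plain gradient iteration would take, which is the acceleration mechanism the abstract attributes to GFSM.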
Experimental results indicate that the proposed method accelerates the search speed of the gradient descent method and helps it escape from local minima. Our results further demonstrate that applying the golden section search method to adjust the prediction step of the DDEPM dynamically is an efficient approach for this search algorithm.
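The golden section search mentioned above is a standard derivative-free line search. A self-contained sketch of the classic algorithm follows (the paper applies it to tune the DDEPM prediction step; the 1-D objective here is only an example):

```python
import math

def golden_section_search(f, a, b, tol=1e-6):
    """Minimize a unimodal f on [a, b] by golden-ratio interval reduction."""
    inv_phi = (math.sqrt(5) - 1) / 2  # 1/phi ~= 0.618
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c               # minimum lies in [a, d]
            c = b - inv_phi * (b - a)
        else:
            a, c = c, d               # minimum lies in [c, b]
            d = a + inv_phi * (b - a)
    return (a + b) / 2

# Example: the minimum of (x - 2)^2 on [0, 5]
xmin = golden_section_search(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
```

Each iteration shrinks the bracketing interval by the constant factor 1/phi while reusing one interior evaluation, so only one new function evaluation is needed per step.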
Chen, C.-M., Lee, H.-M. An Efficient Gradient Forecasting Search Method Utilizing the Discrete Difference Equation Prediction Model. Applied Intelligence 16, 43–58 (2002). https://doi.org/10.1023/A:1012817410590