
Is pocket algorithm optimal?

  • Conference paper
  • First Online:
Computational Learning Theory (EuroCOLT 1995)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 904)


Abstract

The pocket algorithm is generally considered able to provide, for any classification problem, the weight vector that satisfies the maximum number of input-output relations contained in the training set. A convergence theorem ensures that an optimal configuration is reached with probability one as the number of iterations grows indefinitely. In the present paper a new formulation of this theorem is given; a rigorous proof corrects some formal and substantial errors that invalidate previous theoretical results. In particular, it is shown that the optimality of the asymptotic solution is ensured only if the number of permanences of the pocket vector lies in a proper interval of the real axis whose bounds depend on the number of iterations.
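
For readers unfamiliar with the procedure discussed in the abstract, the following is a minimal Python sketch of the pocket algorithm in its standard form: perceptron updates on randomly drawn training examples, with a separate "pocket" vector holding the weights that achieved the longest run of consecutive correct classifications (the number of permanences). The function name, parameters, and fixed iteration budget are illustrative assumptions and are not taken from the paper.

```python
import numpy as np


def pocket_algorithm(X, y, max_iter=10_000, rng=None):
    """Pocket variant of perceptron learning (illustrative sketch only).

    X : (n_samples, n_features) input matrix; a bias column, if desired,
        must be appended by the caller.
    y : (n_samples,) labels in {-1, +1}.
    Returns the "pocket" weight vector, i.e. the candidate whose run of
    consecutive correct classifications on randomly drawn samples is the
    longest observed so far.
    """
    rng = np.random.default_rng() if rng is None else rng
    n_samples, n_features = X.shape
    w = np.zeros(n_features)      # current perceptron weight vector
    pocket_w = w.copy()           # best vector found so far ("in the pocket")
    run = 0                       # current run of consecutive correct classifications
    pocket_run = 0                # longest run recorded for the pocket vector

    for _ in range(max_iter):
        i = rng.integers(n_samples)          # draw a training example at random
        if y[i] * (X[i] @ w) > 0:            # example classified correctly
            run += 1
            if run > pocket_run:             # new record: replace the pocket vector
                pocket_w, pocket_run = w.copy(), run
        else:                                # misclassified: standard perceptron update
            w = w + y[i] * X[i]
            run = 0
    return pocket_w
```

In this reading, the question examined in the paper is whether the pocket vector returned after a growing number of iterations really converges, with probability one, to a weight vector that correctly classifies the maximum possible number of training samples.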




Editor information

Paul Vitányi


Copyright information

© 1995 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Muselli, M. (1995). Is pocket algorithm optimal? In: Vitányi, P. (eds) Computational Learning Theory. EuroCOLT 1995. Lecture Notes in Computer Science, vol 904. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-59119-2_185


  • DOI: https://doi.org/10.1007/3-540-59119-2_185

  • Published:

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-59119-1

  • Online ISBN: 978-3-540-49195-8

  • eBook Packages: Springer Book Archive
