
Design optimization with back-propagation neural networks

Journal of Intelligent Manufacturing

Abstract

A methodology based on back-propagation neural network models is developed to explore artificial neural network (ANN) technology in the new application territory of design optimization. This design methodology goes beyond the Hopfield network model (Hopfield and Tank, 1985) for combinatorial optimization problems. In this approach, pattern classification with a back-propagation network, the most demonstrated power of neural network applications, is used to identify the boundaries between the feasible and infeasible design regions. These boundaries enclose the multi-dimensional space within which designs satisfy all design criteria. A feedforward network is then incorporated to approximate the design objective function; it is trained on objective-function values evaluated at selected design sets in the feasible design regions. Additional optimum design sets found in the classified feasible regions are calculated and included in successive training sets to improve the function mapping. Iteration continues until convergence criteria are satisfied. This paper demonstrates that ANN technology provides a global perspective of the entire design space with good and near-optimal solutions, and can indeed be a potential technology for design optimization.
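
For concreteness, the following is a minimal sketch of the two-stage loop the abstract describes, using scikit-learn's MLPClassifier and MLPRegressor as stand-ins for the paper's back-propagation networks. The constraint function, objective function, network sizes, and iteration budget are all illustrative assumptions, not the authors' formulation.

```python
# Minimal sketch (not the authors' code) of the two-stage design-optimization
# loop: a back-propagation classifier learns the feasible-region boundary,
# a feedforward regressor approximates the objective, and new evaluations
# are folded into successive training sets. feasible() and objective() are
# made-up stand-ins for real design criteria and a real design objective.
import numpy as np
from sklearn.neural_network import MLPClassifier, MLPRegressor

rng = np.random.default_rng(0)

def feasible(x):                      # hypothetical design constraints
    return (x[:, 0] + x[:, 1] <= 1.5) & (x[:, 0] * x[:, 1] >= 0.1)

def objective(x):                     # hypothetical design objective
    return (x[:, 0] - 0.4) ** 2 + (x[:, 1] - 0.6) ** 2

# Stage 1: pattern classification of feasible vs. infeasible designs.
X = rng.uniform(0.0, 1.0, size=(400, 2))
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, feasible(X))

# Stage 2: function approximation of the objective on feasible designs.
Xf = X[feasible(X)]
yf = objective(Xf)
reg = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)

for it in range(5):                   # iterate until a convergence test passes
    reg.fit(Xf, yf)
    # Search the surrogate only over candidates classified as feasible.
    cand = rng.uniform(0.0, 1.0, size=(2000, 2))
    cand = cand[clf.predict(cand)]
    if len(cand) == 0:
        continue
    best = cand[np.argmin(reg.predict(cand))]
    # Evaluate the true objective there and grow the training set.
    Xf = np.vstack([Xf, best])
    yf = np.append(yf, objective(best[None, :]))

print("near-optimal design:", Xf[np.argmin(yf)], "objective:", yf.min())
```

Restricting the surrogate search to candidates the classifier labels feasible is what gives the approach its global view of the design space: each new true-objective evaluation is spent where the learned boundary says all design criteria hold.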

References

  • Abe, S. and Kawakami, J. (1990) Theories on the Hopfield neural networks with inequality constraints. International Joint Conference on Neural Networks-90, 1, 349–52.

  • Akiyama, A., Yamashita, A., Kajiura, M. and Aiso, H. (1989) Combinatorial optimization with Gaussian machines. International Joint Conference on Neural Networks-89, 1, 533–40.

  • Battiti, R. (1990) Optimization methods for back-propagation: automatic parameter tuning and faster convergence. International Joint Conference on Neural Networks-90, 1, 593–6.

  • Carpenter, G. A. and Grossberg, S. (1987) A massively parallel architecture for a self-organizing neural pattern recognition machine. Computer Vision, Graphics and Image Processing, 37, 54–115.

  • Chen, H. and Lee, S. J. (1990) Optimization search using neural networks. International Joint Conference on Neural Networks-90, 2, 503–6.

  • DARPA (1988) DARPA Neural Network Study, AFCEA International Press.

  • Hecht-Nielsen, R. (1987) Kolmogorov's mapping neural network existence theorem. IEEE International Conference on Neural Networks-87, 3, 11–14.

  • Hecht-Nielsen, R. (1989) Theory of the backpropagation neural network. International Joint Conference on Neural Networks-89, 1, 593–605.

  • Hinton, G. E., Sejnowski, T. J. and Ackley, D. H. (1984) Boltzmann machines: constraint satisfaction networks that learn. Technical Report CMU-CS-84-119, Carnegie Mellon University.

  • Hopfield, J. J. and Tank, D. W. (1985) Neural computation of decisions in optimization problems. Biological Cybernetics, 52, 141–52.

  • Irie, B. and Miyake, S. (1988) Capabilities of three-layered perceptrons. IEEE International Conference on Neural Networks-88, 1, 641–8.

  • Kolmogorov, A. N. (1957) On the representation of continuous functions of many variables by superposition of continuous functions of one variable and addition. Dokl. Akad. Nauk SSSR, 114, 953–6.

  • Lippmann, R. P. (1987) An introduction to computing with neural nets. IEEE ASSP Magazine, April, 4–22.

  • Lippmann, R. P. (1989) Review of neural networks for speech recognition. Neural Computation, 1, 1–38.

  • Ramanujam, J. and Sadayappan, P. (1988) Optimization by neural networks. IEEE International Conference on Neural Networks-88, 2, 325–32.

  • Rumelhart, D. E., Hinton, G. E. and Williams, R. J. (1986) Learning internal representations by error propagation, in Parallel Distributed Processing, Vol. I, Rumelhart, D. E. and McClelland, J. L. (eds), MIT Press, Cambridge, MA, pp. 318–62.

  • Szu, H. (1987) Fast simulated annealing. Physics Letters A, 122, 157–62.

  • Vanderplaats, G. N. (1984) Numerical Optimization Techniques for Engineering Design: with Applications, McGraw-Hill Book Company, New York.

  • Vanderplaats, G. N. (1985) COPES/ADS — A Fortran Control Program for Engineering Synthesis Using the ADS Optimization Program, Engineering Design Optimization, Inc.

  • Werbos, P. J. (1987) Building and understanding adaptive systems: a statistical/numerical approach to factory automation and brain research. IEEE Transactions on Systems, Man, and Cybernetics, SMC-17, 7–20.

  • Werbos, P. J. (1988) Backpropagation: past and future. IEEE International Conference on Neural Networks-88, 1, 343–53.

Cite this article

Lee, S. J. and Chen, H. (1991) Design optimization with back-propagation neural networks. Journal of Intelligent Manufacturing, 2, 293–303. https://doi.org/10.1007/BF01471177
