Comparison of Neural Networks Incorporating Partial Monotonicity by Structure

  • Conference paper
Artificial Neural Networks - ICANN 2008

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 5164)

Abstract

Neural networks applied in control loops and safety-critical domains have to meet more requirements than just the overall best function approximation. On the one hand, a small approximation error is required; on the other hand, the smoothness and the monotonicity of selected input-output relations have to be guaranteed. Otherwise, the stability of most control laws is lost. Three approaches to partially monotonic models are compared in this article: the Bounded Derivative Network (BDN) [1], the Monotonic Multi-Layer Perceptron Network (MONMLP) [2], and Constrained Linear Regression (CLR). The authors investigate the advantages and disadvantages of these approaches with respect to approximation performance, model training, and convergence.
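The structural idea behind MONMLP [2], guaranteeing monotonicity in selected inputs by keeping every weight on the path from a constrained input positive, can be sketched as follows. This is an illustrative sketch with arbitrary untrained parameters, not the implementation compared in the paper; the function `monmlp` and all parameter names are hypothetical.

```python
import math
import random

random.seed(0)
H = 4  # number of hidden units

# Unconstrained parameters; positivity of the effective weights is
# enforced structurally via exp(), so monotonicity survives any training.
v = [random.gauss(0, 1) for _ in range(H)]   # input -> hidden
c = [random.gauss(0, 1) for _ in range(H)]   # hidden -> output
b = [random.gauss(0, 1) for _ in range(H)]   # hidden biases

def monmlp(x):
    """One-hidden-layer network that is non-decreasing in x by construction:
    exp() keeps every weight positive and tanh is an increasing activation,
    so the composition is a sum of non-decreasing functions of x."""
    return sum(math.exp(ci) * math.tanh(math.exp(vi) * x + bi)
               for vi, ci, bi in zip(v, c, b))

# Verify monotonicity numerically on a grid over [-3, 3].
xs = [i / 20 - 3 for i in range(121)]
ys = [monmlp(x) for x in xs]
assert all(y0 <= y1 for y0, y1 in zip(ys, ys[1:]))
```

Because the constraint is built into the architecture rather than checked after training, the guarantee holds for every parameter setting, which is exactly the "by structure" property the paper's title refers to.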

References

  1. Turner, P., Guiver, J., Lang, B.: Introducing the State Space Bounded Derivative Network for Commercial Transition Control. In: Proceedings of the American Control Conference, Denver, Colorado, June 4–6 (2003)

  2. Lang, B.: Monotonic Multi-layer Perceptron Networks as Universal Approximators. In: Duch, W., Kacprzyk, J., Oja, E., Zadrożny, S. (eds.) ICANN 2005. LNCS, vol. 3697, pp. 31–37. Springer, Heidelberg (2005)

  3. Zhang, H., Zhang, Z.: Feed forward networks with monotone constraints. In: IEEE International Joint Conference on Neural Networks IJCNN 1999, Washington, DC, USA, vol. 3, pp. 1820–1823 (1999)

  4. Sill, J.: Monotonic Networks. In: Advances in Neural Information Processing Systems, vol. 10, pp. 661–667. MIT Press, Cambridge, MA (1998)

  5. Sill, J., Abu-Mostafa, Y.S.: Monotonicity Hints. In: Advances in Neural Information Processing Systems, vol. 9, pp. 634–640. MIT Press, Cambridge, MA (1997)

  6. Kay, H., Ungar, L.H.: Estimating monotonic functions and their bounds. AIChE J. 46, 2426

  7. Tarca, L.A., Grandjean, B.P.A., Larachi, F.: Embedding monotonicity and concavity information in the training of multiphase flow neural network correlations by means of genetic algorithms. Computers and Chemical Engineering 28(9), 1701–1713 (2004)

Editor information

Věra Kůrková, Roman Neruda, Jan Koutník

Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Minin, A., Lang, B. (2008). Comparison of Neural Networks Incorporating Partial Monotonicity by Structure. In: Kůrková, V., Neruda, R., Koutník, J. (eds) Artificial Neural Networks - ICANN 2008. ICANN 2008. Lecture Notes in Computer Science, vol 5164. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-87559-8_62

  • DOI: https://doi.org/10.1007/978-3-540-87559-8_62

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-87558-1

  • Online ISBN: 978-3-540-87559-8

  • eBook Packages: Computer Science (R0)
