
PAN RAM Bootstrapping Regressor - A New RAM-Based Architecture for Regression Problems

  • Conference paper
Intelligent Systems (BRACIS 2022)

Abstract

RAM-based neural networks predate Multilayer Perceptrons by a few decades. Despite being fast and easy to implement in hardware, they have drawbacks compared with more recent techniques, particularly in handling continuous input variables, which kept them out of mainstream research. About a decade ago, the PAN RAM was proposed to handle continuous inputs with a polynomial approximator neuron in the n-tuple architecture, applied to binary decision problems; constraints on applications and data preparation still remained. This paper presents an evolution of the PAN RAM neuron capable of regression, together with a single-layer architecture that is built dynamically through bootstrapping. The proposed system was benchmarked against the Multilayer Perceptron (MLP) on three regression problems using the Abalone, White Wine, and California Housing datasets from the UCI repository. In a one-tailed paired t-test over a 10-fold cross-validation comparison of the Mean Absolute Error (MAE), the proposed system performed better than the MLP on the Abalone and White Wine datasets, whereas on the California Housing dataset there was no significant improvement, all at a 0.05 significance level.
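For readers who want to reproduce the comparison methodology (not the PAN RAM model itself, which is not provided on this page), the following is a minimal sketch of the evaluation protocol the abstract describes: fold-wise MAE under 10-fold cross-validation and a one-tailed paired t-test at the 0.05 level. It assumes scikit-learn and SciPy, uses the California Housing data, and substitutes a gradient-boosting regressor as a stand-in for the proposed PAN RAM Bootstrapping Regressor; the MLP settings are illustrative, not the authors' configuration.

# Sketch of the evaluation protocol from the abstract: per-fold MAE under
# 10-fold cross-validation, compared with a one-tailed paired t-test at the
# 0.05 significance level. GradientBoostingRegressor is only a placeholder
# for the PAN RAM Bootstrapping Regressor, which is not publicly available.
import numpy as np
from scipy.stats import ttest_rel
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import GradientBoostingRegressor  # placeholder model
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import KFold
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = fetch_california_housing(return_X_y=True)

kf = KFold(n_splits=10, shuffle=True, random_state=0)
mae_candidate, mae_mlp = [], []

for train_idx, test_idx in kf.split(X):
    X_tr, X_te = X[train_idx], X[test_idx]
    y_tr, y_te = y[train_idx], y[test_idx]

    # Candidate model (stand-in) and MLP baseline, trained on the same folds.
    candidate = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
    mlp = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(50,), max_iter=1000, random_state=0),
    ).fit(X_tr, y_tr)

    mae_candidate.append(mean_absolute_error(y_te, candidate.predict(X_te)))
    mae_mlp.append(mean_absolute_error(y_te, mlp.predict(X_te)))

# One-tailed paired t-test: H1 is that the candidate's fold-wise MAE is lower
# than the MLP's (alternative='less' requires SciPy >= 1.6).
stat, p_value = ttest_rel(mae_candidate, mae_mlp, alternative="less")
print(f"mean MAE candidate = {np.mean(mae_candidate):.4f}, "
      f"mean MAE MLP = {np.mean(mae_mlp):.4f}, p = {p_value:.4f}")
print("significant at 0.05" if p_value < 0.05 else "not significant at 0.05")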



Author information


Corresponding author

Correspondence to Starch Melo de Souza.



Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

de Souza, S.M., de Lima, K.P., da Cunha Carneiro Lins, A.J., de Brito, A.F.Q., Adeodato, P.J.L. (2022). PAN RAM Bootstrapping Regressor - A New RAM-Based Architecture for Regression Problems. In: Xavier-Junior, J.C., Rios, R.A. (eds.) Intelligent Systems. BRACIS 2022. Lecture Notes in Computer Science, vol. 13654. Springer, Cham. https://doi.org/10.1007/978-3-031-21689-3_40


  • DOI: https://doi.org/10.1007/978-3-031-21689-3_40

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-21688-6

  • Online ISBN: 978-3-031-21689-3

  • eBook Packages: Computer Science, Computer Science (R0)
