Abstract
RAM-based neural networks predate Multilayer Perceptrons by a few decades. Despite being fast and readily implementable in hardware, they have drawbacks compared to more recent techniques, particularly in handling continuous input variables, which kept them out of mainstream research. About a decade ago, the PAN RAM attempted to handle continuous inputs with a polynomial approximator neuron in the n-tuple architecture, applied to binary decision problems; constraints on applications and data preparation still remained. This paper presents an evolution of the PAN RAM neuron capable of regression, together with a single-layer architecture that is built dynamically through bootstrapping. The proposed system was benchmarked against the Multilayer Perceptron (MLP) on three regression problems, using the Abalone, White Wine, and California Housing datasets from the UCI repository. In a one-tailed paired t-test on a 10-fold cross-validation comparison of Mean Absolute Error (MAE), at a 0.05 significance level, the proposed system outperformed the MLP on the Abalone and White Wine datasets, while on the California Housing dataset the difference was not significant.
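The evaluation protocol described above (a one-tailed paired t-test on fold-wise MAE from 10-fold cross-validation) can be sketched as follows. This is not the authors' code, and the fold-wise MAE values are made-up illustrative numbers; only the statistical procedure matches the abstract.

```python
# Hedged sketch of the abstract's evaluation protocol: a one-tailed
# paired t-test comparing two regressors' MAE across 10 CV folds.
import math

def paired_t_statistic(a, b):
    """t statistic for the paired differences d = a - b.

    Under H1: mean(d) < 0, i.e. model `a` has lower MAE than model `b`,
    we reject H0 when t falls below the negative critical value.
    """
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Hypothetical fold-wise MAE values (placeholders, not paper results).
pan_ram_mae = [1.52, 1.49, 1.55, 1.47, 1.50, 1.53, 1.48, 1.51, 1.46, 1.54]
mlp_mae     = [1.60, 1.58, 1.62, 1.55, 1.59, 1.61, 1.57, 1.63, 1.56, 1.60]

t = paired_t_statistic(pan_ram_mae, mlp_mae)
# One-tailed critical value for alpha = 0.05 with df = 9 is about -1.833.
significant = t < -1.833
```

In practice the same test is available as `scipy.stats.ttest_rel(a, b, alternative='less')`, which also returns the p-value directly.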
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
de Souza, S.M., de Lima, K.P., da Cunha Carneiro Lins, A.J., de Brito, A.F.Q., Adeodato, P.J.L.: PAN RAM bootstrapping regressor - a new RAM-based architecture for regression problems. In: Xavier-Junior, J.C., Rios, R.A. (eds.) Intelligent Systems. BRACIS 2022. Lecture Notes in Computer Science, vol. 13654. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-21689-3_40
Print ISBN: 978-3-031-21688-6
Online ISBN: 978-3-031-21689-3