Abstract
We take three steps toward shifting probability from a descriptive tool for unpredictable events to a way of understanding them. At a very elementary level, we state an operational definition of probability based solely on symmetry assumptions about the observed data. This definition converges to Kolmogorov's through a special law of large numbers, which represents a first way of twisting features observed in the data with properties expected of the next observations. Within this meaning of probability we fix a general sampling mechanism for generating random variables and extend our twisting device to compute probability distributions over population properties on the basis of the likelihood of the observed features. Here the core of randomness translates from the above symmetry assumptions into a generator of unit-uniform random variables. To discover suitable features (classically defined as sufficient statistics), we refer directly to the notion of Kolmogorov complexity, and to the coding theorem in particular. This connects the features to the inner structure of the observed data in terms of concise computer codes describing them within a well-equipped computational framework.
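The sampling mechanism mentioned above, in which all randomness is reduced to a generator of unit-uniform random variables, can be illustrated by the standard inverse-transform scheme; the sketch below is our own minimal example, not the paper's construction, with the helper name `sample_via_inverse_cdf` and the exponential target distribution chosen purely for illustration.

```python
import math
import random

def sample_via_inverse_cdf(inverse_cdf, n, seed=None):
    """Draw n variates by mapping unit-uniform numbers u in (0, 1)
    through the inverse cumulative distribution function."""
    rng = random.Random(seed)
    return [inverse_cdf(rng.random()) for _ in range(n)]

# Exponential(1) has F^{-1}(u) = -ln(1 - u)
draws = sample_via_inverse_cdf(lambda u: -math.log(1.0 - u), 100_000, seed=0)

# By the law of large numbers the empirical mean of the sample
# approaches the distribution mean (here 1) as n grows.
empirical_mean = sum(draws) / len(draws)
```

The same uniform seed stream can feed any target distribution by swapping the inverse CDF, which is what makes the unit-uniform generator a sufficient "randomness core".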
This new statistical framework allows us to recover and improve results on computational learning at both the subsymbolic and symbolic stages, outlining a single shell in which the full trip from sensory data to their conceptual management may occur.
© 2002 Springer-Verlag London Limited
Apolloni, B., Malchiodi, D., Zoppis, I., Gaito, S. (2002). Twisting Features with Properties. In: Tagliaferri, R., Marinaro, M. (eds) Neural Nets WIRN Vietri-01. Perspectives in Neural Computing. Springer, London. https://doi.org/10.1007/978-1-4471-0219-9_33
Print ISBN: 978-1-85233-505-2
Online ISBN: 978-1-4471-0219-9