Abstract
The role of activation functions in feedforward artificial neural networks (FFANNs) has received little systematic investigation. The commonly used sigmoidal functions appear only as isolated points in the space of sigmoidal functions, which makes comparing them difficult. Moreover, these functions can be interpreted as the (suitably scaled) integral of some probability density function, generally taken to be symmetric or bell-shaped. Two parameterization methods are proposed that construct classes of sigmoidal functions from any given sigmoidal function. The suitability of the members of the proposed classes is investigated, and it is demonstrated that all members satisfy the requirements for acting as activation functions in FFANNs.
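The paper's specific parameterizations are not reproduced in this abstract. As a hedged illustration only, the following sketch shows one simple way a one-parameter family of sigmoidal functions could be generated from a base sigmoid while preserving the properties an FFANN activation needs (boundedness, monotonicity, limits 0 and 1); the `power_family` construction here is a hypothetical example, not the method proposed in the paper.

```python
import math

def logistic(x: float) -> float:
    # Base sigmoidal function: sigma(-inf) = 0, sigma(+inf) = 1,
    # monotone increasing; it is the CDF of the logistic density.
    return 1.0 / (1.0 + math.exp(-x))

def power_family(sigma, p: float):
    # Hypothetical illustration (not the paper's construction): for p > 0,
    # x -> sigma(x)**p keeps the limits 0 and 1 and stays monotone,
    # so every member of the family remains a valid sigmoidal activation.
    if p <= 0:
        raise ValueError("p must be positive")
    return lambda x: sigma(x) ** p

# p = 1 recovers the base function; other values of p skew the curve,
# giving a continuum of activations rather than isolated points.
g = power_family(logistic, 2.0)
assert abs(power_family(logistic, 1.0)(0.7) - logistic(0.7)) < 1e-12
assert 0.0 < g(-5.0) < g(0.0) < g(5.0) < 1.0  # monotone, bounded in (0, 1)
```

Varying `p` continuously connects family members, which is the kind of structure that makes comparison between activations tractable, in contrast to the isolated standard sigmoids mentioned above.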
Chandra, P. Sigmoidal Function Classes for Feedforward Artificial Neural Networks. Neural Processing Letters 18, 205–215 (2003). https://doi.org/10.1023/B:NEPL.0000011137.04221.96