
Optimization of Softmax Layer in Deep Neural Network Using Integral Stochastic Computation (Open Access)

Deep neural networks (DNNs) have recently received tremendous attention as a key machine learning technique for classification and detection tasks on images, video, speech, and audio. Integral stochastic computation (integral SC), meanwhile, has proved highly effective for the hardware implementation of DNNs. The softmax layer is a basic and important network layer used in most multi-class classification DNNs; however, its hardware implementation is expensive because it requires exponentiation and division. In this paper, we design an efficient way to implement the softmax layer in DNNs based on integral stochastic computing, filling a gap left by previous work. Compared to a conventional softmax hardware implementation, our method reduces power and area by 63% and 68%, respectively.
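For context, the sketch below shows (a) a reference floating-point softmax, whose `exp` and division are the operations that make a direct hardware implementation costly, and (b) a toy integral-SC encoding in which a value in [0, m] is represented by a stream of small integers whose time-average approximates it. The function names and parameters (`integral_stochastic_stream`, `m`, `length`) are illustrative assumptions, not the paper's actual hardware design.

```python
import math
import random

def softmax(logits):
    # Numerically stable softmax. The exponentiation and the final
    # division are exactly the operations that make a direct hardware
    # softmax expensive.
    mx = max(logits)
    exps = [math.exp(x - mx) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def integral_stochastic_stream(value, m=4, length=256, rng=None):
    # Toy integral-SC encoding (illustrative): each stream element is
    # the sum of m independent Bernoulli bits with p = value / m, so
    # the mean of the stream approximates `value` in [0, m].
    rng = rng or random.Random(0)
    p = value / m
    return [sum(rng.random() < p for _ in range(m)) for _ in range(length)]
```

A usage check: the softmax outputs sum to 1, and the mean of a long integral-SC stream converges to the encoded value.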

Keywords: deep neural network; integral stochastic computation; softmax layer

Document Type: Research Article

Publication date: 01 December 2018

Published in: Journal of Low Power Electronics (JOLPE), an international forum offering scientists and engineers timely, peer-reviewed research in the field of low-power electronics.