
Minimum precision requirements for the SVM-SGD learning algorithm



Abstract:

It is well known that the precision of the data, weight vector, and internal representations employed in learning systems directly impacts their energy, throughput, and latency. The precision requirements of the training algorithm are also important for systems that learn on-the-fly. In this paper, we present analytical lower bounds on the precision requirements of the commonly employed stochastic gradient descent (SGD) on-line learning algorithm in the specific context of a support vector machine (SVM). These bounds are obtained subject to a desired level of system performance and are validated using the UCI breast cancer dataset. Additionally, the impact of these precisions on the energy consumption of a fixed-point SVM with on-line training is studied. Simulation results in a 45 nm CMOS process show that operating at the minimum precision dictated by our bounds reduces energy consumption by a factor of 5.3× compared to conventional precision assignments, with no observable loss in accuracy.
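To make the setting concrete, the sketch below shows what fixed-point SVM-SGD training can look like: a linear SVM trained with on-line SGD on the hinge loss, where each input sample and the weight vector are rounded to a fixed-point grid. The quantize helper, the chosen bit-widths (8 bits for data and weights), the learning rate, and the use of scikit-learn's breast-cancer loader are illustrative assumptions for this sketch only; they are not the quantization scheme, precision bounds, or experimental setup of the paper.

```python
# Minimal sketch of fixed-point SVM-SGD (hinge loss + L2 regularization).
# Bit-widths, step sizes, and the quantization scheme are assumptions made
# for illustration; the paper derives analytical lower bounds on precision.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.preprocessing import MinMaxScaler


def quantize(x, n_bits, x_max=1.0):
    """Round x to a signed fixed-point grid with n_bits total bits."""
    step = x_max / (2 ** (n_bits - 1))
    return np.clip(np.round(x / step) * step, -x_max, x_max)


def svm_sgd_fixed_point(X, y, n_bits_x=8, n_bits_w=8,
                        lam=1e-3, lr=0.01, epochs=10, seed=0):
    """On-line SGD for a linear SVM with quantized data and weights."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):
            xq = quantize(X[i], n_bits_x)            # quantized input sample
            margin = y[i] * (np.dot(w, xq) + b)
            if margin < 1:                           # hinge-loss subgradient step
                w = (1 - lr * lam) * w + lr * y[i] * xq
                b = b + lr * y[i]
            else:                                    # only regularization shrinkage
                w = (1 - lr * lam) * w
            w = quantize(w, n_bits_w, x_max=4.0)     # quantized weight vector
    return w, b


if __name__ == "__main__":
    data = load_breast_cancer()                      # UCI Wisconsin breast cancer
    X = MinMaxScaler().fit_transform(data.data)      # scale features to [0, 1]
    y = 2 * data.target - 1                          # labels in {-1, +1}
    w, b = svm_sgd_fixed_point(X, y)
    acc = np.mean(np.sign(X @ w + b) == y)
    print(f"training accuracy with 8-bit data/weights: {acc:.3f}")
```

In a sketch like this, the bit-widths n_bits_x and n_bits_w are the knobs the paper's bounds are meant to set: choosing them at the minimum values that preserve the desired accuracy is what yields the reported energy savings.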
Date of Conference: 05-09 March 2017
Date Added to IEEE Xplore: 19 June 2017
Electronic ISSN: 2379-190X
Conference Location: New Orleans, LA, USA

