17.2 A 142nW Voice and Acoustic Activity Detection Chip for mm-Scale Sensor Nodes Using Time-Interleaved Mixer-Based Frequency Scanning


Abstract:

Acoustic sensing is one of the most widely used sensing modalities for intelligently assessing the environment. In particular, ultra-low-power (ULP) always-on voice activity detection (VAD) is gaining attention as an enabling technology for IoT platforms. In many practical applications, acoustic events of interest occur infrequently, so system power consumption is typically dominated by the always-on acoustic wakeup detector while the remainder of the system is power-gated the vast majority of the time. A previous acoustic wakeup detector [1] consumed just 12nW but could not process voice signals (up to 4kHz bandwidth) or handle non-stationary events, both essential qualities for a VAD. Prior VAD ICs [2], [3] demonstrated reliable performance but consumed significant power (>20µW) and lacked an analog frontend (AFE), which further increases power. Recent analog-domain feature-extraction-based VADs [4], [5] also reported µW-level power consumption, and their simple decision tree [4] or fixed neural-network approach [5] limits broader use across varied acoustic event targets. In summary, no sub-µW VAD has been reported to date, preventing the use of VADs in unobtrusive mm-scale sensor nodes.
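As a rough illustration of the frequency-scanning idea named in the title, the Python/NumPy sketch below estimates per-band signal energy by mixing successive time slices of the input with a stepped local-oscillator tone and low-pass filtering, one band per slice (the time interleaving). This is a conceptual sketch only: the function names, scan frequencies, filter, and slice schedule are hypothetical, and the actual chip implements the mixing and filtering in the analog domain with details not given in this excerpt.

import numpy as np

def scan_band_energies(x, fs, band_centers_hz, slice_len):
    """Estimate per-band energy by sequentially (time-interleaved)
    mixing slices of the input down to DC and low-pass filtering.

    Illustrative parameters only; the chip's actual scan schedule,
    filter order, and bandwidths are not given in this excerpt.
    """
    energies = []
    for i, fc in enumerate(band_centers_hz):
        # Time interleaving: each band is examined in its own time slice.
        seg = x[i * slice_len : (i + 1) * slice_len]
        t = np.arange(len(seg)) / fs
        # Mixer: shift the band of interest down to DC.
        baseband = seg * np.exp(-2j * np.pi * fc * t)
        # Crude low-pass filter: a moving average keeps only near-DC content.
        lp = np.convolve(baseband, np.ones(32) / 32, mode="valid")
        energies.append(np.mean(np.abs(lp) ** 2))
    return np.array(energies)

if __name__ == "__main__":
    fs = 8000  # 8kS/s covers the up-to-4kHz voice band mentioned above
    t = np.arange(fs) / fs
    x = np.sin(2 * np.pi * 300 * t) + 0.1 * np.random.randn(fs)  # tone + noise
    bands = [125, 250, 500, 1000, 2000]  # hypothetical scan frequencies
    print(scan_band_energies(x, fs, bands, slice_len=fs // len(bands)))

Running this, the 125Hz band (closest to the 300Hz test tone after the coarse filter) and its neighbors show elevated energy relative to the higher bands, which is the kind of per-band contrast an activity detector can threshold against.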
Date of Conference: 17-21 February 2019
Date Added to IEEE Xplore: 07 March 2019
Conference Location: San Francisco, CA, USA
