Abstract
When implementing digital signal processing (DSP) algorithms for real-time audio applications, one frequently faces incompatibilities between the hardware buffer length (the internal buffer of a professional sound card) and the software buffer size imposed by the underlying algorithm (due, e.g., to multirate or FFT constraints). This mismatch is resolved by suitable frame size conversion algorithms, which inevitably introduce delay. In this context, this paper presents a buffering scheme together with a theoretical proof of the minimum-delay property it exhibits. Some examples derived from frequently encountered issues in DSP applications are reported.
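As a rough illustration of the problem the abstract describes (not the authors' specific algorithm, whose details and minimum-latency proof are in the paper), a frame size converter can be sketched as a FIFO adapter that accepts hardware-sized input frames and emits software-sized output frames once enough samples have accumulated. All names below are hypothetical.

```python
from collections import deque


class FrameSizeConverter:
    """Sketch of a FIFO-based frame size converter.

    Accepts input frames of n_in samples (e.g., the sound card's
    hardware buffer) and emits output frames of n_out samples
    (e.g., the block size an FFT-based algorithm requires).
    The FIFO depth is the source of the conversion delay that
    the paper analyzes.
    """

    def __init__(self, n_in, n_out):
        self.n_in = n_in
        self.n_out = n_out
        self.fifo = deque()

    def push(self, frame):
        # Append one hardware-sized input frame to the FIFO.
        assert len(frame) == self.n_in
        self.fifo.extend(frame)

    def pull(self):
        # Emit one software-sized output frame, or None if not
        # enough samples have accumulated yet.
        if len(self.fifo) < self.n_out:
            return None
        return [self.fifo.popleft() for _ in range(self.n_out)]
```

For example, converting 64-sample hardware frames to 48-sample software frames: the first `pull()` after one `push()` succeeds, the second returns `None` because only 16 samples remain buffered, and the residual samples carried in the FIFO are what give rise to the latency whose minimum the paper characterizes.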
Cite this article
Moretti, E., Peretti, P., Palestini, L. et al. A theoretical analysis of a buffer frame size conversion algorithm for audio applications ensuring minimum latency. SIViP 5, 185–190 (2011). https://doi.org/10.1007/s11760-010-0153-0